1.4 * 10^6 km vs. 14 * 10^3 km.
[1] https://en.wikipedia.org/wiki/International_Image_Interopera...
https://openseadragon.github.io/
https://codepen.io/Kimtaro88/embed/JjgLyJg/ee2ebbe7041869032...
er, 149 million km away [0] not 43
If we got 17.5ish kW per square metre here on Earth, you'd know about it (but only briefly).
[0] https://www.esa.int/Science_Exploration/Space_Science/Solar_...
There's a diagram here, but at least some of the information there seems preliminary as they eventually launched with a black "Solar Black" heat shield coating rather than white titanium dioxide because the latter wasn't sufficiently UV-stable.
https://www.eoportal.org/satellite-missions/solar-orbiter-mi...
Not the Sun's distance from Earth, nor the radiant intensity at the Sun's surface.
(The comment was unclearly worded and it took me a couple of readings and review of comments to realise the intent.)
We receive about 1 kW of sunlight per square metre on Earth, and Earth is 149 million km from the Sun. From napkin math, it should rather be ~45 MW/m² at the Sun's surface to give 1 kW/m² at Earth (the surface of a sphere of radius 149 million km divided by the surface of the Sun gives ~45,000, so 1 W leaving the Sun becomes 1/45,000 W by the time it reaches Earth).
Where am I wrong?
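A quick sanity check in code: the napkin math is basically right, and the gap between ~45 MW/m² and the actual figure comes from using the rounded 1 kW/m² instead of the measured solar constant of ~1361 W/m² (the 17.5 kW figure upthread is at Solar Orbiter's perihelion, not at the surface).

```python
# Inverse-square check: the irradiance ratio between the Sun's surface
# and Earth's orbit is the square of the distance ratio.
R_SUN = 6.96e8         # solar radius, metres
D_EARTH = 1.49e11      # Earth-Sun distance, metres
SOLAR_CONSTANT = 1361  # W/m^2 at Earth's orbit, above the atmosphere

ratio = (D_EARTH / R_SUN) ** 2
surface_irradiance = SOLAR_CONSTANT * ratio  # W/m^2 at the photosphere

print(f"ratio: {ratio:.0f}")                             # ~45,800, matching the ~45,000 estimate
print(f"surface: {surface_irradiance / 1e6:.0f} MW/m^2")  # ~62 MW/m^2
```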
It's pretty amazing that you can have a spacecraft in nearly 20x direct sunlight, permanently, and still have it actually work.
Total solar power output = 4π × (1.5e11 m)² × 1361 W/m² = 3.85e26 W
Sun's "surface" irradiance = TSPO / (4π × (6.96e8 m)²) = 6.32e7 W/m²
At Solar Orbiter's perihelion, measuring from the Sun's center rather than its surface: TSPO / (4π × (4.2e10 m)²) = 1.74e4 W/m².
^ Except for Earth's irradiance and the distances, these are rough theoretical values rather than observed ones, because reality is messier than simplified models.
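The same three numbers as a minimal sketch, using the solar constant at 1 AU and the inverse-square law:

```python
import math

AU = 1.5e11      # Earth-Sun distance, m
R_SUN = 6.96e8   # solar radius, m
R_PERI = 4.2e10  # Solar Orbiter perihelion (~0.28 AU), m
S_EARTH = 1361   # solar constant at 1 AU, W/m^2

def sphere_area(r):
    """Surface area of a sphere of radius r."""
    return 4 * math.pi * r ** 2

tspo = S_EARTH * sphere_area(AU)         # total solar power output, W
surface = tspo / sphere_area(R_SUN)      # irradiance at the photosphere, W/m^2
perihelion = tspo / sphere_area(R_PERI)  # irradiance at perihelion, W/m^2

print(f"{tspo:.2e} W")          # ~3.85e26 W
print(f"{surface:.2e} W/m^2")   # ~6.3e7 W/m^2
print(f"{perihelion:.2e} W/m^2")  # ~1.74e4 W/m^2
```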
The usual sense involves integration and differentiation, but look at senses 2 & 4. It can also mean any calculation.
IIRC the Sun converts ~4.5 million tons of mass into energy every second, and even then there are objects that are trillions of times more energetic/violent. The first LIGO detection converted about 3 solar masses into energy in a fraction of a second.
And 4.5 million tons of mass/second may be unimaginably huge, but the Sun is so big it can also do that constantly for literally billions and billions of years. And it's not even an especially big star!
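Both figures are easy to sanity-check with E = mc² (treating "tons" as metric tonnes; GW150914 is usually quoted as radiating roughly 3 solar masses of energy in a fraction of a second):

```python
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

# Sun: ~4.5 million tonnes (4.5e9 kg) of mass converted to energy per second
sun_power = 4.5e9 * C ** 2  # watts
print(f"{sun_power:.1e} W")  # ~4e26 W, close to the ~3.85e26 W solar luminosity

# GW150914: ~3 solar masses radiated as gravitational waves
gw_energy = 3 * M_SUN * C ** 2  # joules
print(f"{gw_energy:.1e} J")  # ~5.4e47 J
```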
Looking online I found this:
https://www.nasa.gov/solar-system/sounds-of-the-sun/
But it's not realistic because:
> Finally, he interpolated over the missing data and scaled the data (speeded it up a factor 42,000 to bring it into the audible human-hearing range (kHz)).
Also this:
https://www.youtube.com/watch?v=xAesteoeoF4
but it's not actually sound; it's EM waves converted into sound:
> ...recording frequency and amplitude information about these plasma waves that scientists can then play as sound waves.
So I'm really curious: would the genuine sound of the Sun just be white noise, like a waterfall or rumble, or would it have defined frequencies (hums)? Or is it all so low-frequency or high-frequency that it isn't even audible?
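For a sense of scale: the Sun's dominant acoustic oscillations, the famous "five-minute" p-modes, sit around 3 mHz, far below the ~20 Hz floor of human hearing, which is roughly why that 42,000× speedup was needed. A rough check, assuming the ~300 s oscillation period:

```python
PERIOD = 300     # seconds, the Sun's "five-minute" p-mode oscillations
SPEEDUP = 42_000  # scaling factor quoted in the NASA sonification

f_real = 1 / PERIOD          # ~3.3 mHz, well below audibility
f_scaled = f_real * SPEEDUP  # ~140 Hz, comfortably audible

print(f"{f_real * 1000:.1f} mHz -> {f_scaled:.0f} Hz")  # 3.3 mHz -> 140 Hz
```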
The Sun's surface/photosphere is 5,772 K (commonly rounded to ~6,000 K, about 5,500 °C), the corona is on the order of 1 million K, and the core is around 15 million K.
https://x.com/AJamesMcCarthy/status/1638648459002806272
by
Andrew McCarthy: https://www.instagram.com/cosmic_background/
Jason Guenzel: https://www.instagram.com/thevastreaches/
If you want to be pedantic, every single picture ever taken with a digital camera is digitally modified. Every single image shot on film and scanned to be used on a computer is digitally modified.
Just because you can't take a photo of the sun anywhere close to this does not mean others of us cannot, and does not make their actual images of the sun not real. Using proper filters so you do not melt your equipment allows for images of the photosphere to be captured. Using the moon to filter the photosphere during an eclipse allows the corona to be seen. It's not like it's not there except during an eclipse. It's just too faint to be captured without the filter.
That's why the SRO uses a coronagraph to block the photosphere at all times, so it can image the corona.
Imaging the Sun is very fun and challenging, and I suspect you'd learn a lot from reading up on it. Whether you'd actually enjoy it is beyond the scope of this forum.
> A geometrically altered image of the 2017 eclipse as an artistic element in this composition to display an otherwise invisible structure. Great care was taken to align the two atmospheric layers in a scientifically plausible way using NASA's SOHO data as a reference.
https://cosmicbackground.io/products/fusion-of-helios
I mean, take a look at some of the photographer's other work...
https://cosmicbackground.io/products/tales-from-the-solar-sy...
Great. It’s not strictly a graph of photons. Zero people are using this stitched together image to perform science. Moreover, virtually every single space image intended for public consumption has been converted from UV/radio/infrared into the visible spectrum, retouched, stitched together as a composite, or experienced some other form of artistic manipulation.
Nobody cares. Nobody should care. This is a thoroughly inconsequential hill to die on and a completely pointless bit of pedantry.
I'm not affiliated, but I've been seriously debating it for a long time. The photo is a composite of the Sun and its corona from the 2017 eclipse. One of my favorite images of the Sun.
If you view the Sun with eclipse glasses, you basically see the "orange" image, just with your eyes. Add the same level of filtering to a telescope or a long lens on your camera, and you can capture a similar image.
People may not be aware how strongly filtered it is. The PHI imager uses 6 very narrow (<<0.1 nm) passbands, all centered about one absorption line (Fe I, 617.3 nm, as you mention). It's insanely narrowband.
From the abstract of the paper (https://arxiv.org/pdf/1903.11061) describing the instrument:
> SO/PHI measures the Zeeman effect and the Doppler shift in the Fe i 617.3 nm spectral line. To this end, the instrument carries out narrow-band imaging spectro-polarimetry using a tunable LiNbO3 Fabry-Perot etalon, while the polarisation modulation is done with liquid crystal variable retarders (LCVRs). The line and the nearby continuum are sampled at six wavelength points and the data are recorded by a 2k × 2k CMOS detector. [...] The high heat load generated through proximity to the Sun is greatly reduced by the multilayer-coated entrance windows to the two telescopes that allow less than 4% of the total sunlight to enter the instrument, most of it in a narrow wavelength band around the chosen spectral line.
(Note: the 4% figure is just the pre-filtering at the entrance window, before the even sharper filtering done by the etalon.)
So the image you see is just a reconstruction of intensity using the 6 extremely narrow filters (I'm not sure precisely how they do the reconstruction; an analogous NASA instrument called HMI uses the straight average IIRC).
So, the remarks nearby about the black body emission of the Sun, etc., are correct but not relevant to the color used. The red color is just as easily viewed as the traditional coloring used for the scalar intensity represented by this image type, kind of mnemonically related to the fact that the Fe I line at 617.3 nm is in the red wavelengths.
In writing journal papers using these images, sometimes people use the longer but technically correct "pseudo-continuum intensity image" rather than the punchier "photogram". This emphasizes that the image shown is a reconstruction of the continuum intensity.
And as you say, the other "images", including magnetic field and velocity, are reconstructed using other algorithms from these 6 wavelengths. For instance, velocity is recovered because the Fe-I absorption line's location shifts with Doppler velocity along line-of-sight.
And magnetic field is recovered due to the Zeeman effect on the line shape.
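A rough sketch of the Doppler part, assuming the simplest shift-based estimate (the real SO/PHI inversion fits the full Stokes profiles from the six wavelength samples, so this is just the underlying physics, not the actual pipeline):

```python
C = 2.998e8        # speed of light, m/s
LAMBDA0 = 617.3e-9  # rest wavelength of the Fe I line, m

def doppler_velocity(lambda_observed):
    """Line-of-sight velocity inferred from the shift of the absorption line."""
    return C * (lambda_observed - LAMBDA0) / LAMBDA0

# A shift of just 2 pm (0.002 nm) already corresponds to ~1 km/s:
v = doppler_velocity(617.302e-9)
print(f"{v:.0f} m/s")
```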
It's amazing what you can do when you have so many photons!
The sun is emitting light at roughly the spectrum curve of a (non-ideal) black body at 5778 K [1].
The 'black body' curve is the idealized electromagnetic spectral emission curve of how every body 'glows' according to temperature. [0] The peak of the sun's emission curve is around 500 nm, which is a blue-green, but of course it is spread out across a broad spectrum so is closer to white, and then it is differentially scattered by the atmosphere.
But these photos have no atmospheric filtering or scattering, so, perhaps the yellow-orange hue is more related to their own filters?
[0] https://en.wikipedia.org/wiki/Black-body_radiation
[1] https://physics.stackexchange.com/questions/130209/how-can-i...
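The ~500 nm peak mentioned above follows directly from Wien's displacement law:

```python
B_WIEN = 2.898e-3  # Wien's displacement constant, m*K
T_SUN = 5778       # photosphere temperature, K

peak_nm = B_WIEN / T_SUN * 1e9  # peak emission wavelength, nm
print(f"{peak_nm:.0f} nm")      # ~502 nm, in the blue-green
```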
This image is colored because it uses a red filter:
> The instrument collected red light with a wavelength of 617 nanometres.
One last thought, because I think it's fun. The Sun looks yellow to us on Earth because the sky is blue. Think about it.
https://en.wikipedia.org/wiki/Standard_illuminant
https://ars.els-cdn.com/content/image/1-s2.0-B97804431878650...
Does the probe revolve around the Sun?
I'm guessing this is sort of equivalent to manual supersampling rather than combining adjacent (i.e. visually translated to the next subsquare of the photo) viewpoints? Four hours is a pretty short time for 48 million miles of distance.
Edit: well, considering orbital velocity, I guess they probably just zigzagged perpendicular to the orbital plane?
I wonder if the 4 m DKIST on Earth would produce a higher-resolution photomosaic of the sun if it were used to do this one day? Probably. Its field of view is smaller, but it can image features down to the single-digit km scale (~8 km) on the photosphere.
The problem with this is that at 10km scale the features of the sun are changing far faster than at large scales. The rows of exposures' tops and bottoms would not match very well assuming a normal raster scan. The higher the resolution the smaller the timespan you have to take the full disk image. And the higher the resolution the smaller your FOV is. It's a rough situation.
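For a sense of the scales involved, here's the Rayleigh diffraction limit for a 4 m aperture at visible wavelengths, projected onto the Sun from 1 AU. This is only a rough sanity check; the resolution actually achieved depends on the observing wavelength, adaptive optics, and image reconstruction:

```python
WAVELENGTH = 500e-9  # m, visible light
APERTURE = 4.0       # m, DKIST primary mirror diameter
AU = 1.496e11        # m, Earth-Sun distance

theta = 1.22 * WAVELENGTH / APERTURE  # Rayleigh criterion, radians
linear_km = theta * AU / 1000         # projected feature size on the photosphere

print(f"{theta:.2e} rad -> ~{linear_km:.0f} km on the Sun")  # ~23 km
```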
I sometimes wonder if we should have probes orbiting each planet and the sun, with the most generically useful sensors, for research that isn't planned decades in advance. But I really don't know if that would be a good investment of scarce scientific resources.
Visible: https://eopro.esa.int/wp-content/uploads/2024/10/PHI_Visible...
Magnetogram: https://eopro.esa.int/wp-content/uploads/2024/10/PHI_Magneto...
Velocity map: https://eopro.esa.int/wp-content/uploads/2024/10/PHI_Velocit...
Ultraviolet: https://eopro.esa.int/wp-content/uploads/2024/10/EUI_Ultravi...
Rough edges—that's like rejecting a van Gogh painting because it has too much paint and looks lumpy[0]. Art is lumpy. The ones that are perfectly flat are inkjet prints.
https://eopro.esa.int/wp-content/uploads/2024/10/PHI_Visible...
I couldn't resist blending the visible with ultraviolet in Photoshop, here's the result: https://imgur.com/a/vRGav2d
I did a quick clean up of the hard edges, but didn't want to push pixels too much.