[HN Gopher] New telescope images of Jupiter's moon Io rival thos...
       ___________________________________________________________________
        
       New telescope images of Jupiter's moon Io rival those from
       spacecraft
        
       Author : wglb
       Score  : 172 points
       Date   : 2024-06-04 01:57 UTC (21 hours ago)
        
 (HTM) web link (phys.org)
 (TXT) w3m dump (phys.org)
        
       | whimsicalism wrote:
       | The telescope photo:
       | https://scx2.b-cdn.net/gfx/news/hires/2024/glimpses-of-a-vol...
       | 
       | Stitched spacecraft photo:
       | https://science.nasa.gov/resource/high-resolution-global-vie...
       | 
        | This seems awesome, but I would say 'rival those' is a bit of a
        | stretch.
        
         | OldGuyInTheClub wrote:
         | I was quite taken by the ground-based photo but agree with you
         | upon comparing the links. 50 mile resolution (LBT) vs. 1.6
         | miles (Galileo) is pretty cut-and-dried. Maybe the point is
         | that LBT can do this for other objects not around Jupiter?
        
           | WillAdams wrote:
            | Exactly --- a fly-by of everything in the asteroid belt, for
            | example, would be a budget-buster --- what is there in the
            | belt which it would be interesting to have photos of?
        
             | dylan604 wrote:
             | My high score!! Or we could see which asteroid Han Solo is
             | hiding the Millennium Falcon in. Or we could see all of the
             | illegal mining operations by those pesky guys from Plural
             | Zed Alpha Nine Nine.
             | 
             | We don't know what we won't see until we don't see it.
        
         | max-ibel wrote:
          | I agree; I'm thinking that it's useful to be able to get
         | frequent snapshots of the whole moon at this resolution (if you
         | are interested in time lapse to track volcanic activity for
         | instance). The satellite-based stitched photo probably took a
         | long time to collect.
         | 
         | Also, without knowing details, I suspect you can improve the
         | LBT images as the system matures, but as you say, probably not
         | at the resolution the satellite provides.
        
         | boffinAudio wrote:
          | It depends what the parameters for rivalry are... a ground-
          | based telescope can take multiple photos, perhaps even
          | thousands, at very low cost - although the resolution may not
          | be comparable, this doesn't discount the value of the _data_ to
          | the scientists who are obtaining it at _a far greater reduction
          | of cost and effort_ than those who rely on space-based
          | instruments. Remember, we don't have a permanent instrument
          | stationed at Io - these spacecraft are doing fly-bys and thus
          | have a limited window of opportunity.
          | 
          | So, while the resolution may be great eye-candy, the
          | consistency of the data over time is vastly different. "Higher
          | resolution" does not always mean "better science", especially
          | if it's a one-shot compared to thousands of data-samples...
        
         | rbanffy wrote:
         | It rivals images from spacecraft in Earth orbit - as in Hubble
         | or Webb. It won't rival images from cameras sent to the near
         | vicinity of the targets, but those can take decades from the
         | moment you decide to take a picture to the time you take it,
         | unless you already decided you'd take some pictures from that
         | general location decades prior.
        
           | shiroiushi wrote:
           | >It rivals images from spacecraft _in Earth orbit_ - as in
           | Hubble or Webb. (emphasis mine)
           | 
           | It seems like this critical detail was left out of the
           | headline.
        
             | rbanffy wrote:
              | Still technically correct, the best kind of correct. I
              | wonder how much cheaper land-based telescopes are over the
              | lifetime of instruments such as the Hubble or the Webb.
             | 
             | Also, a huge shame the Overwhelmingly Large Telescope was
             | cancelled. We need more creative names for those.
        
               | kloch wrote:
                | Angular resolution through the atmosphere has been solved
               | with adaptive optics and advanced mathematical
               | techniques.
               | 
                | Unfortunately, nothing can remove the thermal emission of
                | the atmosphere (which affects infrared imaging), or its
                | absorption of many wavelength bands.
        
               | alfiopuglisi wrote:
                | > I wonder how much cheaper land-based telescopes are
                | over the lifetime of instruments such as the Hubble or
                | the Webb.
               | 
                | 10-100 times cheaper. An LBT night is around $50k-100k,
                | which over 10 years corresponds to $300 million. JWST's
                | total budget is about $10 billion.
               | 
               | True, JWST can operate close to 24/7. On the other hand,
               | land-based telescopes are under constant refurbishment
               | and upgrades, and they become more powerful over time.
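         
                A quick sanity check on that estimate (the nightly rate is
                the figure from the comment above; nights per year is an
                assumption, since no telescope actually observes every
                night of the year):
         
                    # Back-of-envelope cost comparison; inputs are rough
                    # figures from the thread, not official budgets.
                    nightly_cost = 82_000      # USD, midpoint of $50k-100k
                    nights_per_year = 365      # assumes nightly operation
                    lbt_decade = nightly_cost * nights_per_year * 10
                    print(f"LBT decade: ~${lbt_decade / 1e6:.0f}M")  # ~$299M
                    print(f"JWST ratio:  {10e9 / lbt_decade:.0f}x")  # ~33x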
        
           | WrongAssumption wrote:
           | Doesn't change your point, as I believe you are mainly
           | referring to distance from the object being observed. But
           | Webb orbits the sun.
        
         | holoduke wrote:
          | That telescope image is the current highest res, not from the
          | new telescope; that is yet to be released.
        
           | petesergeant wrote:
           | Are you sure? The image is captioned:
           | 
           | "Jupiter moon Io, imaged by SHARK-VIS on Jan. 10, 2024. This
           | is the highest resolution image of Io ever obtained by an
           | Earth-based telescope"
        
       | wthomp wrote:
        | I visited the Large Binocular Telescope just a month or two ago.
        | A very impressive facility, and one can only imagine the image
        | quality if images were captured using both mirrors coherently.
        
       | eternauta3k wrote:
       | For those interested in the telescope, I highly recommend this
       | Omega Tau episode: https://omegataupodcast.net/111-optical-
       | astronomy-and-the-la...
        
       | ggm wrote:
        | You would think an article comparing images would... well, would
        | show you the two images side-by-side.
        | 
        | From the post(s) below it's impressive, but it's definitely
        | lower res.
       | 
       | Over the life of managing telescopes, is it actually cheaper than
       | a craft in orbit?
        
         | pulvinar wrote:
         | Yes, the LBT could theoretically do 0.006 arcsecond resolution
         | in binocular configuration (22.8 meter aperture), which is
         | about 14 feet at Jupiter distance, or 10 times worse at best.
         | But then it costs 10 times less, and can see the whole universe
         | at that resolution.
        
           | dakr wrote:
            | Presumably you meant to say 14 miles, not 14 feet. Also,
            | since the adaptive optics system acts on the near-IR light,
            | let's shift the 500nm used to calculate the Rayleigh
            | criterion to at least 1 micron, which doubles the limiting
            | resolution to around 0.011 arcsec.
        
             | SiempreViernes wrote:
             | Close, 16 mas in R: https://sites.google.com/inaf.it/shark-
             | vis/home
        
             | pulvinar wrote:
             | Yes, sorry, 14 miles (a wishful typo!)
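         
              The figures above follow from the Rayleigh criterion,
              theta = 1.22 * lambda / D. A minimal sketch (aperture and
              Jupiter distance taken from this thread):
         
                  import math
         
                  ARCSEC_PER_RAD = 180 / math.pi * 3600   # ~206265
         
                  def rayleigh_arcsec(wavelength_m, aperture_m):
                      # Diffraction limit: theta = 1.22 * lambda / D (rad)
                      return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD
         
                  d_jupiter_mi = 444e6            # Earth-Jupiter, miles
                  for lam in (500e-9, 1e-6):      # visible vs. near-IR
                      theta = rayleigh_arcsec(lam, 22.8)  # 22.8 m aperture
                      mi = theta / ARCSEC_PER_RAD * d_jupiter_mi
                      print(f"{lam * 1e9:.0f} nm: {theta:.4f} arcsec "
                            f"~ {mi:.0f} miles at Jupiter")
                  # 500 nm: 0.0055 arcsec ~ 12 miles
                  # 1000 nm: 0.0110 arcsec ~ 24 miles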
        
       | lokimedes wrote:
       | Somebody should make a cave painting of these worlds, you know,
       | just in case...
        
         | yayr wrote:
          | Well, we've already buried all of GitHub in the Arctic; we
          | probably should do that with some LLMs as well.
        
           | CamperBob2 wrote:
           | Rod Serling beat us to it, I think:
           | https://www.imdb.com/title/tt0734669/
        
       | yayr wrote:
        | It looks like for around 1 BTC you could get one full night's
        | access to this amazing instrument :-)
        | 
        | (you'll probably have to convert it to cash first, though)
       | https://www.lbto.org/lbt-access/
        
       | ItsBob wrote:
        | Could they take the high-res orbiter photos and the lower-but-
        | still-really-good ground-based photos, and use some sort of AI
        | algo to enhance the ground-based ones?
        | 
        | The idea being that they have high-res reference photos that are
        | a one-shot deal, but can take regular Earth-based ones and auto-
        | enhance them from now on.
       | 
       | It could then show changes over time in high res?
       | 
       | I'm showing my limitations here, obviously, but I know what I
       | mean... it makes sense in my head :)
        
         | sandos wrote:
          | Sure, super-resolution exists in various forms and this is one.
          | But this is an example of not actually adding any scientific
          | information to the photo: extrapolating data like that will
          | never yield anything new or unexpected, so...
        
         | dylan604 wrote:
          | But what purpose would an AI-generative-assisted image
          | actually serve for science? This is an issue I have with the
          | gung-ho AI crowd that thinks AI should be used for anything
          | and everything all the time.
          | 
          | Even if you trained the model against the most detailed images
          | available, that data was a mere snapshot of the exact time it
          | was taken, which in some cases is decades old. If things are
          | actually changing on these bodies, then using that stale data
          | to update current images would actually be damaging to science,
          | as it would be attempting to make the current look like the
          | old. No! We need to see what it looks like now for the
          | comparisons.
         | 
         | Enhance! It can only go so far. Otherwise, you're just a low-
         | rent Hollywood SFX team generating new worlds for whatever
         | space opera you weren't hired to work on.
        
       | SiempreViernes wrote:
       | An Italian instrument apparently, built by the Rome observatory:
       | https://sites.google.com/inaf.it/shark-vis/home
        
       | jiggawatts wrote:
       | <moved>
        
         | perihelions wrote:
         | Wrong thread!
        
       | ricksunny wrote:
       | Sometimes (usually?) the tool is more interesting than the
       | science it enables.
        
       | bhouston wrote:
        | I would expect that the James Webb telescope could create even
        | better images of Io if they pointed it there? As long as it
        | could adjust its focus range to look at nearby objects...
        
         | mikepurvis wrote:
         | Isn't JWST mostly about infrared though? That's the reason it
         | has to be at L2, because those measurements are so much more
         | sensitive to interference.
         | 
         | These new Io images are in the visible spectrum, so it might be
         | more apt to compare it to Hubble.
        
           | BurningFrog wrote:
           | True, but infrared images should have value in addition to
           | the visible light images.
        
         | itishappy wrote:
         | See for yourself:
         | 
         | https://news.berkeley.edu/2023/07/27/james-webb-space-telesc...
        
           | dylan604 wrote:
           | Expecting "better" from JWST compared to the image from TFA
           | or even Hubble is definitely a misunderstanding of the
           | differences between observation platforms. Just because the
           | JWST mirror is larger than the Hubble's does not mean it will
           | produce a "better" image as they are looking at different
           | frequencies of light. Thinking that JWST will produce the
           | same type of image with more detail/resolution is an
           | incorrect way of thinking of the JWST's purpose.
        
       | VikingCoder wrote:
       | If I'm doing my math correctly, Io covers about 0.06% as many
       | degrees of our vision from Earth as the moon does. (I'm not good
       | at this math, but I'm trying.)
       | 
        | Io:   diameter 2263.8 miles; Jupiter's distance to Earth
        |       444,000,000 miles
        |       perp/base 0.000005098648649 -> 0.000005098648649 rad
        |       -> 0.0002922792219 deg -> 1.052205199 arcsec
        | 
        | Moon: diameter 2159.1 miles; distance to Earth 238,900 miles
        |       perp/base 0.009037672666 -> 0.009037426614 rad
        |       -> 0.5180690416 deg -> 1865.04855 arcsec
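         
        The same calculation in a few lines of Python, using the figures
        from the comment above:
         
            import math
         
            def apparent_arcsec(diameter_mi, distance_mi):
                # Apparent size: atan(d / D), converted to arcseconds.
                return math.atan(diameter_mi / distance_mi) * 180 / math.pi * 3600
         
            io = apparent_arcsec(2263.8, 444_000_000)   # ~1.05 arcsec
            moon = apparent_arcsec(2159.1, 238_900)     # ~1864 arcsec
            print(f"ratio: {io / moon:.4%}")            # ~0.0564%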
        
         | IncreasePosts wrote:
         | The moon and Io are roughly the same diameter, so you can just
         | divide Io's distance by the moon's distance to get the ratio of
         | their perceived size without futzing with angles or geometry.
        
           | VikingCoder wrote:
           | Good point! With that shortcut, yeah, it's about 0.05% of the
           | apparent diameter. So the math seems to check out.
        
       | jcims wrote:
        | Io's diameter: ~3600 km
        | 
        | Avg distance to Earth: ~628M km
       | 
       | Apparent diameter is ~5 microradians or ~1 arcsecond
       | 
       | Similar to imaging a marble 5mm in diameter from 1km away.
       | 
       | Betelgeuse is ~1.2 billion km in diameter (for now, lol)
       | 
       | It's 642 light years away.
       | 
        | Its apparent diameter is ~0.2 microradians, or approximately a
        | red blood cell seen from 35 m away.
       | 
       | Space is big. Things are small.
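         
        Those analogies check out under the small-angle approximation: an
        object with angular size theta looks like an object of size
        theta * d at viewing distance d (the light-year conversion and the
        ~7 micron red-blood-cell diameter are standard figures, not from
        the comment):
         
            io_rad = 3600 / 628e6            # ~5.7e-6 rad
            print(io_rad * 1000)             # ~0.0057 m: a marble at 1 km
         
            ly_km = 9.46e12                  # kilometers per light year
            bet_rad = 1.2e9 / (642 * ly_km)  # ~2.0e-7 rad
            print(bet_rad * 35)              # ~6.9e-6 m: a blood cell at 35 m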
        
       | minimalc wrote:
       | I work for the company (Oxford Instruments Andor) that produces
       | the cameras for this telescope:
       | https://sites.google.com/inaf.it/shark-vis/instrument/detect... A
       | great achievement!
       | 
       | It's very exciting to be a (small) part of this, happy to answer
       | any camera software questions (can't speak for the observatory's
       | software though as I haven't seen it)
        
         | hindsightbias wrote:
         | Is this installed on one of the telescopes or integrated
         | between both? I read about the LBT once and it seemed some
         | instruments were on one and some were integrated. I assume it's
         | used as a mono and binoc platform, depending.
        
           | minimalc wrote:
            | I'm not quite sure about their exact optical setup; I know
            | the Zylas are used in the shark-vis instrument[0]. I would
            | guess from their article that one Zyla is dedicated to
            | adaptive optics and one to imaging.
           | 
           | [0] https://sites.google.com/inaf.it/shark-
           | vis/instrument/detect...
        
           | alfiopuglisi wrote:
           | > I assume it's used as a mono and binoc platform, depending.
           | 
           | Correct, it depends on the observation. Both sides have
           | adaptive optics correction, but they work independently. This
           | particular instrument (SHARK-VIS) is mounted on the "right"
           | side, while SHARK-NIR is on the "left" side.
        
         | GrantMoyer wrote:
         | Awesome work!
         | 
          | Are the cameras similar to what's in a consumer digital camera,
          | that is, a single image sensor behind a Bayer filter and a
          | lens? Or does it use some other configuration, like an array of
          | image sensors?
         | 
          | And does sensor readout work similarly to a consumer camera,
          | sequentially reading out rows of sensor data? Is there any cool
          | software processing during the capture, like deconvolution?
        
           | minimalc wrote:
            | > Are the cameras similar to what's in a consumer digital
            | camera, that is, a single image sensor behind a Bayer filter
            | and a lens?
           | 
            | Yes, they're quite similar to consumer camera sensors; our
            | sensors are usually from high-quality production bins. We
            | advertise this quality as "scientific CMOS" (sCMOS) to help
            | highlight this. Consumer sensors can have a significant
            | number of sensor defects which can be corrected so they
            | aren't noticeable in casual photographs, but these defects
            | are very detrimental for scientific imaging, where quality is
            | paramount. Another big difference is the noise and quantum
            | efficiency characteristics of the sensor, which is another
            | key requirement for scientific instruments.
           | 
            | We don't supply lenses; I think the logic is that scientific
            | customers know exactly what kind of optical setup they want,
            | so most customers would tend to use their own optical
            | equipment or buy it in.
            | 
            | Our cameras are monochrome (scientific cameras tend to care
            | more about raw resolution than having a smaller resolution
            | with a Bayer layer), so customers typically use different
            | color/wavelength filters to get what they want and process
            | them into true color images later if needed.
           | 
           | > Or does it use some other configuration, like an array of
           | image sensors?
           | 
            | This particular camera, the Zyla, has just one sensor. Though
            | it is a little unique in our portfolio, in that the sensor
            | can be read out from both halves simultaneously in various
            | patterns. If you're interested in the hardware, we provide
            | lots of info in our hardware manual: https://andor.oxinst.com/down
            | loads/uploads/Zyla_hardware_use... I don't think we offer
            | multi-sensor solutions, though I could be wrong.
           | 
           | > And does sensor readout work similarly to a consumer
           | camera, sequentially reading out rows of sensor data?
           | 
            | Yes, there are two electronic shuttering modes we offer:
            | rolling and global. Rolling takes a sequential row-by-row
            | readout, and global does a readout of the entire sensor. The
            | cameras used by the observatory can only do rolling, but we
            | have other Zyla models which also do global. There can be
            | tradeoffs in choosing which one to use; typically framerate,
            | noise, and image distortion are the key factors in choosing.
            | Global is available on some high-end consumer cameras, but
            | generally most consumer sensors will do rolling. Though this
            | may have changed since I last looked.
           | 
            | > Is there any cool software processing during the capture,
            | like deconvolution?
           | 
            | On the camera side of the company, we try to leave the image
            | as clean and raw as possible. We perform correction
            | processing during acquisition on the camera; as high quality
            | as the bins are, you still have to correct and characterize
            | for various things to get the best performance in a
            | scientific scenario.
           | 
            | On the applications side of the company we do all kinds of
            | image processing: deconvolution (this is a big deal in the
            | confocal microscopy world; we have our own patented
            | deconvolution method: srrf-stream) https://fusion-benchtop-
            | software-guide.scrollhelp.site/fusio..., AI analysis, 3d/4d
            | imaging (https://imaris.oxinst.com/). Probably lots more I
            | don't know about (I'm on the camera side).
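         
            The filter-wheel workflow described above, in miniature:
            separate monochrome exposures taken through color filters get
            stacked into an RGB image afterwards (the gains are a made-up
            white-balance step, not a calibrated pipeline):
         
                import numpy as np
         
                def compose_color(r, g, b, gains=(1.0, 1.0, 1.0)):
                    # Stack three monochrome filter exposures (2D arrays)
                    # into one RGB image, with illustrative channel gains.
                    channels = [f * k for f, k in zip((r, g, b), gains)]
                    rgb = np.stack(channels, axis=-1)
                    return np.clip(rgb / rgb.max(), 0.0, 1.0)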
        
         | drewrv wrote:
         | How does the "compensation for atmospheric turbulence" work? It
         | honestly sounds impossible, like those tv shows where the
         | detective "enhances" a blurry photo.
        
           | alfiopuglisi wrote:
           | It's a technological tour-de-force involving deformable
           | mirrors that change shape every millisecond, cameras able to
           | count every incoming photon, and special computers designed
           | to calculate the next correction within microseconds. As
            | usual, Wikipedia has an introduction:
           | https://en.wikipedia.org/wiki/Adaptive_optics
           | 
           | Or try: https://andor.oxinst.com/learning/view/article/introd
           | uction-...
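         
            Conceptually the control loop is simple, even if the hardware
            is extreme: measure the residual wavefront error, nudge the
            deformable mirror a fraction of the way toward cancelling it,
            and repeat roughly every millisecond. A toy scalar version of
            that integrator loop (all numbers illustrative, not from any
            real AO system):
         
                import random
         
                gain, dm_command, phase = 0.4, 0.0, 0.0
                for step in range(2000):           # ~one step per ms
                    phase += random.gauss(0, 0.05) # atmosphere drifts
                    residual = phase - dm_command  # wavefront sensor reading
                    dm_command += gain * residual  # integrator update to DM
                print(f"final residual: {phase - dm_command:+.3f}")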
        
           | minimalc wrote:
            | Sorry, I'm not a big expert in the field of optics, but I am
           | aware of our cameras being used to perform adaptive optics
           | and lucky imaging.
           | 
           | Adaptive optics in particular requires very fast framerates
           | and low latency to make rapid adjustments to the mirror's
           | shape to compensate for the constantly changing atmosphere.
           | It's really amazing that it's possible at all! I believe this
           | is the method used here, though I can't say with certainty.
           | 
            | Lucky imaging is more akin to a brute-force method, where you
            | acquire lots and lots of images quickly, then keep and
            | process the best ones, taken when the atmosphere was being
            | particularly cooperative and not distorting the image much.
           | 
           | Again, there are lots of experts out there on the topic, this
           | is just my simple view into it.
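         
            A minimal sketch of the lucky-imaging selection step: score
            each short exposure with a sharpness metric and stack only the
            best few percent. (Variance of the Laplacian is one common
            metric; this is an illustration, not what SHARK-VIS actually
            runs.)
         
                import numpy as np
                from scipy.ndimage import laplace
         
                def lucky_stack(frames, keep_fraction=0.01):
                    # frames: array of shape (n_frames, height, width).
                    scores = np.array([laplace(f).var() for f in frames])
                    n_keep = max(1, int(len(frames) * keep_fraction))
                    best = np.argsort(scores)[-n_keep:]  # sharpest frames
                    return frames[best].mean(axis=0)     # average them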
        
           | dekhn wrote:
           | The replies already posted are quite good. Let me explain it
           | a different way:
           | 
            | When light passes through the atmosphere, it undergoes a
            | convolution with what's known as a point spread function
            | (think of it as convolving the signal with a 2D Gaussian that
            | spreads the intensity out to neighboring pixels). If we know
            | that PSF's specific details, we can deconvolve the image,
            | either computationally, or by modifying the mirror in real
            | time.
           | 
           | From my understanding, you can project a laser into the
           | atmosphere, where it gets affected by the PSF. When you look
           | at that laser projection, you can find the PSF (because you
           | know the input shape of the laser, and what it looks like
           | after being affected by the PSF), and therefore use that in
            | real time to deconvolve the astronomical images you are
            | collecting.
           | 
            | This process can be done so quickly it can adapt to immediate
            | changes in the atmosphere (turbulence). "Enhance" is
            | definitely a thing - it's widely used in both telescopes and
            | microscopes (and if you had the right priors for a blurry
            | photo, you could do it there too).
           | 
           | I think this is a relatively simple read:
           | https://en.wikipedia.org/wiki/Laser_guide_star along with
           | https://www.llnl.gov/article/44936/guide-star-leads-
           | sharper-...
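         
            The computational version of this, in miniature: blur a scene
            with a known Gaussian PSF, then recover it by deconvolution.
            (Richardson-Lucy is one standard algorithm; real pipelines are
            far more sophisticated.)
         
                import numpy as np
                from scipy.signal import fftconvolve
                from skimage.restoration import richardson_lucy
         
                # Two point sources, standing in for stars.
                scene = np.zeros((64, 64))
                scene[20, 20] = scene[40, 45] = 1.0
         
                # Gaussian PSF, standing in for atmospheric blur.
                y, x = np.mgrid[-7:8, -7:8]
                psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
                psf /= psf.sum()
         
                # Clip away tiny negative FFT round-off before deconvolving.
                blurred = np.clip(fftconvolve(scene, psf, mode="same"), 0, None)
                restored = richardson_lucy(blurred, psf, num_iter=30)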
        
       | bloopernova wrote:
        | The ever-changing surface of so many planets leads me to wish we
        | had satellites in orbit around as many of them as possible. I'd
        | love to read the "weather report" for Io or Titan!
        
       ___________________________________________________________________
       (page generated 2024-06-04 23:01 UTC)