[HN Gopher] New camera offers ultrafast imaging at a fraction of...
___________________________________________________________________
New camera offers ultrafast imaging at a fraction of the normal
cost
Author : wglb
Score : 64 points
Date : 2023-09-15 14:16 UTC (1 day ago)
(HTM) web link (phys.org)
(TXT) w3m dump (phys.org)
| xeonmc wrote:
| So basically it's like a CRT sweeping electron beams across the
| screen
| beckerdo wrote:
| Photos or it didn't happen!
| araes wrote:
| The full paper can be found here:
| https://opg.optica.org/optica/fulltext.cfm?uri=optica-10-9-1...
|
| It has pictures of:
|
| Fig. 3. Side-view observation of laser-induced breakdown in
| distilled water https://i.imgur.com/s4NeSmP.jpg
|
| Fig. 4. Front-view imaging of cavitation dynamics
| https://i.imgur.com/GCyW6xY.jpg
|
| Fig. 5. Imaging the laser ablation of single-layer onion cells
| https://i.imgur.com/K7H0uso.jpg
|
| As well as the setup and configuration.
| hinkley wrote:
| They claim to have photos. They don't seem to have shared
| them :/
| numpad0 wrote:
| > He realized that rapidly changing the tilt angle of periodic
| facets on a diffraction grating, which can generate several
| replicas of the incident light traveling in different directions,
| could present a way to sweep through different spatial positions
| to gate out frames at different time points.
|
| Is this saying that light through a lenticular lens sheet comes
| out slightly early/late depending on angles, and the sheet could
| be used as an array of delay lines with different time constants?
| foota wrote:
| I don't think so, I think it's more like sweeping an image
| across multiple sensors over time using the grating (probably
| via one of the piezo optical effects?)
|
| But I'm not familiar with your terminology, so maybe that's
| what you said :)
|
| Edit: Ah, they're using a DMD. The paper is published here:
| https://opg.optica.org/optica/fulltext.cfm?uri=optica-10-9-1...
| IIUC, when the DMD mirrors are switching between on and off
| you can get different diffraction patterns while they're
| shifting, and these different diffraction orders will move the
| image on the camera.
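|
| A minimal sketch of the grating equation behind that
| (illustrative numbers, assumed rather than taken from the
| paper):
|
|     import math
|
|     # Grating equation at normal incidence: sin(theta_m) = m * wl / d
|     wl = 532e-9    # wavelength, meters (assumed green laser)
|     d = 10.8e-6    # grating period, meters (assumed DMD mirror pitch)
|
|     for m in range(-2, 3):
|         s = m * wl / d
|         if abs(s) <= 1:  # an order only propagates if |sin| <= 1
|             print(f"order {m:+d}: {math.degrees(math.asin(s)):+.2f} deg")
|
| Each order leaves at a different angle, so it lands at a
| different spot on the sensor; selecting orders in sequence
| sweeps the image across the camera.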
| dylan604 wrote:
| how could light ever be early?
| tzs wrote:
| Suppose you have two points, X and Y, each distance d from
| some origin O. At time 0, some light is emitted from O
| towards both X and Y. The earliest it can arrive at either of
| them is time d/c, where c is the speed of light in a vacuum.
|
| If you want the light to arrive at different times you can
| either do something to make the paths different lengths, or
| to make the speed of light different on the two paths, or
| both. You can't make the speed faster than c on either of the
| paths but you can make it slower by making the light go
| through something with an index of refraction greater than 1.
|
| Now suppose you've got some thing that you have built to
| process light, and you care about the differences in arrival
| time of the light at different parts of the thing. There's
| probably going to be some part of your thing that it makes
| the most sense to use as a reference, and then describe light
| arrival times elsewhere relative to that reference point.
|
| There's usually no particular reason to pick as the reference
| the point on your thing that light first reaches. If you do
| indeed pick a reference that is not the earliest place of
| contact with the light, then you can have light arrive
| earlier at other parts of your thing.
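|
| A tiny numerical sketch of that point (values assumed purely
| for illustration):
|
|     c = 299_792_458.0      # speed of light in vacuum, m/s
|     d = 0.30               # both paths are 0.30 m long (assumed)
|     n_glass = 1.5          # index of refraction on the glass path
|
|     t_x = d / c            # O -> X through vacuum: earliest possible
|     t_y = n_glass * d / c  # O -> Y through glass: slowed to c / n
|
|     print(f"O->X: {t_x * 1e9:.2f} ns, O->Y: {t_y * 1e9:.2f} ns")
|     print(f"light reaches X {(t_y - t_x) * 1e9:.2f} ns earlier")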
| dylan604 wrote:
| light either arrives exactly on time, or arrives late.
| whatever other convoluted description you want to come up
| with, light cannot be early based on your equation of d/c
| tzs wrote:
| Consider light from a distant source, sufficiently far
| away that the wavefront at your instrument is planar. You
| have a planar sensor array parallel to the incoming
| wavefront, so the wavefront arrives at all sensors at the
| same time.
|
| There are different kinds of sensors sensing different
| aspects of the incoming light. You need to combine them,
| making sure that you are combining readings that come
| from the same plane of the wavefront.
|
| But one of your sensors takes a little longer than the
| others to process and produce a reading. You either need
| to make that sensor start processing earlier than the
| other sensors, or make the other sensors start processing
| later, or add some sort of delay between the other sensor
| outputs and the thing that combines all the sensor
| outputs.
|
| If you fix this by moving that one sensor a little out of
| your sensor plane, moving it toward the light source,
| most people, including most scientists, would say that
| your fix was to make the light arrive at that sensor
| earlier, because the light in fact arrives at that sensor
| now before it arrives _at the other sensors_.
|
| Let's say that the sensors process different wavelengths
| of light. Then another adjustment you might do to get the
| outputs synced would be to fill the space between where
| light enters and the sensor plane with a gas that happens
| to have a low index of refraction for the wavelengths the
| slow sensor uses and a high index for the wavelengths the
| other sensors use.
|
| That would make the light arrive later at all the sensors
| than it would have if the gas had not been added, but less
| so at the slow sensor. But again, nearly everyone would say
| that what your fix did was make the light arrive earlier at
| the slow sensor, because before the fix light arrived at the
| slow sensor at the same time it arrived _at the other
| sensors_, and after the fix it arrives at the slow sensor
| before it arrives _at the other sensors_.
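|
| A quick sketch of the first fix, moving the slow sensor out
| of the plane (the 100 ps lag is an assumed number):
|
|     c = 299_792_458.0   # m/s; treating the medium as n ~ 1
|     lag = 100e-12       # slow sensor's extra processing time, s
|
|     # Moving the sensor toward the source by c * lag makes the
|     # wavefront reach it that much earlier than the sensor plane.
|     offset = c * lag
|     print(f"move the slow sensor {offset * 1e3:.0f} mm forward")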
| distract8901 wrote:
| The speed of light isn't constant. Light travels slower in
| glass and water than in air. The difference in light speed in
| different materials is why refraction and lenses work.
| Additionally, light can bounce around inside a prism or a
| lens and take a longer path which takes more time.
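|
| Concretely (v = c / n, with typical indices):
|
|     c = 299_792_458.0  # m/s in vacuum
|     for name, n in [("air", 1.0003), ("water", 1.333), ("glass", 1.52)]:
|         print(f"{name}: {c / n / 1e8:.3f}e8 m/s")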
| dylan604 wrote:
| sounds like you didn't quite think about the question.
| everything you just said would make light late. it can't be
| early though.
| Dylan16807 wrote:
| What definition of "on time" are you using, and why do
| you assume it's the only good definition?
| distract8901 wrote:
| Early and late are typically relative terms. Some photons
| absolutely can arrive _earlier than_ other, slower
| photons.
| spockz wrote:
| Sure it can. Suppose the material being used normally allows
| one speed, and by changing something like a voltage you can
| make the light go faster or take a longer route. Then you can
| speak of light being "early(-ier)" or late compared to the
| default state.
| dylan604 wrote:
| "you can make the light go faster"
|
| are you listening to the words you're saying? seriously?
| yes, we can make light go slower so that the sensor
| receives light at different times. i'm with you all the
| way to the point you make some ridiculous not thought out
| comment about making light go faster. No. Just stop. We
| cannot do that. We can only slow light down.
| dekhn wrote:
| Nobody is saying they are increasing c, everything here
| is relative. You are making many comments that are
| misinterpreting what people are saying to you.
| h1fra wrote:
| 4.8 million frames per second is wild. Currently the Phantom
| can achieve, I think, 1.5M fps in black and white at a
| "postage stamp" resolution. I wonder what resolution this
| camera could output, but I guess it's probably smaller than
| 120px.
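|
| For scale, the implied inter-frame times (simple arithmetic
| from the quoted rates):
|
|     for name, fps in [("DRUM", 4.8e6), ("Phantom", 1.5e6)]:
|         print(f"{name}: {1 / fps * 1e9:.0f} ns between frames")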
| davidhyde wrote:
| They should probably have indicated how many frames they can
| capture too.
|
| > The team created a DRUM camera with a sequence depth of seven
| frames, meaning that it captures seven frames in each short
| movie.
|
| Seems like their movies may only be 7 frames long, so they're
| not really comparable to Phantom-type cameras, which can
| capture a lot more content.
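|
| Back-of-the-envelope using the article's 4.8 Mfps figure:
|
|     fps = 4.8e6   # frames per second, from the article
|     depth = 7     # sequence depth, from the article
|     print(f"each movie spans ~{depth / fps * 1e6:.2f} microseconds")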
|
| Still, it's great to see the mirror tech used in DLP
| projectors having other uses. Some 3D resin printers use it
| too, but that's still projection, I guess. I think those DLP
| mirror chips are amazing and probably fascinating to hack
| around with.
| h0l0cube wrote:
| > He realized that rapidly changing the tilt angle of
| periodic facets on a diffraction grating, which can generate
| several replicas of the incident light traveling in different
| directions, could present a way to sweep through different
| spatial positions to gate out frames at different time
| points.
|
| My guess is the 'sequence depth' relates to the number of
| diffraction gates. So perhaps all that's needed is an
| ultrafast shutter to string together a long 'movie'.
| billfruit wrote:
| Does it use compressed sensing?
| hgomersall wrote:
| Without having read any more than can be gleaned from the
| article, it sounds like it's using similar principles. That is,
| the image is transformed into a domain (frequency) that is more
| amenable to rapidly capturing the necessary information
| without time gating directly. It's not obvious to me what the
| setup is, but this type of approach is neat.
| sp332 wrote:
| No. It's a really fancy mirror.
| hinkley wrote:
| Pointing at several different sensors, or relying on the
| speed of light to time-shift arrival at the sensor? I still
| can't tell from the article, but either of those would be a
| neat trick.
___________________________________________________________________
(page generated 2023-09-16 23:00 UTC)