[HN Gopher] TSMC bets on unorthodox optical tech
       ___________________________________________________________________
        
       TSMC bets on unorthodox optical tech
        
       Author : Rohitcss
       Score  : 94 points
       Date   : 2025-05-26 17:15 UTC (5 hours ago)
        
 (HTM) web link (spectrum.ieee.org)
 (TXT) w3m dump (spectrum.ieee.org)
        
       | Liftyee wrote:
       | As I understand it (from designing high-speed electronics), the
       | major limitations to data/clock rates in copper are signal
       | integrity issues. Unwanted electromagnetic interactions all
       | degrade your signal. Optics is definitely a way around this, but
       | I wonder if/when it will ever hit similar limits.
        
         | moelf wrote:
          | luckily photons are bosons (if we ever push things to that
          | level of extreme)
        
           | Taek wrote:
           | This comment appears insightful but I have no idea what it
           | means. Can someone elaborate?
        
             | scheme271 wrote:
              | Electrons are fermions, which means that two electrons
              | can't occupy the same quantum state (Pauli exclusion
              | principle). Bosons don't have that limit, so I believe that
              | implies you can have stronger signals at the low end, since
              | you can have multiple photons conveying or storing the same
              | information.
        
             | cycomanic wrote:
             | What the previous poster is implying is that electrons
             | interact much more strongly than photons. Hence electrons
             | are very good for processing (e.g. building a transistor),
             | while photons are very good for information transfer. This
             | is also a reason why much of the traditional "optical
             | computer" research was fundamentally flawed, just from
             | first principles one could estimate that power requirements
             | are prohibitive.
        
             | xeonmc wrote:
              | Fermions can "hit each other" while bosons "pass right
              | through each other".
             | 
             | (Strong emphasis on the looseness of the scare quotes.)
        
         | EA-3167 wrote:
         | The energy densities required for photon-photon interactions
         | are so far beyond anything we need to worry about that it's a
         | non-issue. Photons also aren't going to just ignore local
         | potential barriers and tunnel at the energy levels and scales
         | involved in foreseeable chip designs either.
        
         | notepad0x90 wrote:
          | isn't attenuation also an issue with copper? maybe with small
          | electronics it is negligible given the right amps? in other
          | words, even with no interference, electrons will face impedance
          | and start losing information.
        
           | to11mtm wrote:
            | Attenuation is going to be an issue for any signal, but in my
            | experience fiber can go for many miles without a repeater,
            | whereas with something like coax you're looking at one to two
            | orders of magnitude less. [0]
            | 
            | [0] - Mind you, some of that for coax is due to other issues
            | around CTB and/or the challenge that in coax you've got many
            | frequencies running alongside each other, with each frequency
            | having different attenuation per 100 feet...
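To put rough numbers on the fiber-vs-coax reach gap described above, here is a toy link-budget calculation; the attenuation figures are ballpark illustrative values, not from the thread or any vendor spec:

```python
# Rough link-budget sketch: how far a signal travels before using up a
# fixed loss budget. All attenuation numbers are illustrative ballparks.
BUDGET_DB = 30.0

media = {
    "single-mode fiber": 0.2,      # dB per km (typical at 1550 nm)
    "coax (RG-6, ~1 GHz)": 100.0,  # dB per km (order of magnitude)
}

for name, loss_per_km in media.items():
    reach_km = BUDGET_DB / loss_per_km
    print(f"{name}: ~{reach_km:g} km before {BUDGET_DB:.0f} dB loss")
```

With these assumed numbers the fiber reach works out to roughly two orders of magnitude beyond coax, matching the comment's estimate.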
        
           | bgnn wrote:
            | This is the main mechanism of interference here anyhow,
            | called inter-symbol interference.
        
         | lo0dot0 wrote:
          | Optics also has signal integrity issues. In practice, OSNR and
          | SNR limit optics. Cutting the fiber still breaks it. Small
          | vibrations also affect the signal's phase.
        
           | cycomanic wrote:
            | Phase variations will not introduce any issues here; they
            | are most certainly talking about intensity modulation. You
            | can't really (easily) do coherent modulation using incoherent
            | light sources like LEDs.
           | 
           | SNR is obviously an issue for any communication system,
           | however fiber attenuation is orders of magnitude lower than
           | coax.
           | 
            | The bigger issue in this case would be mode dispersion,
            | considering that they are going through "imaging" fibres,
            | i.e. different spatial components of the light walking off
            | from each other, causing temporal spread of the pulses until
            | they overlap and you can't distinguish 1's and 0's.
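The mode-dispersion walk-off described above can be put in rough numbers with the standard step-index estimate; the fiber parameters below are assumptions for illustration, not from the article:

```python
# Crude step-index modal-dispersion estimate (all numbers illustrative).
# Pulse spread per unit length: dtau/L ~ NA^2 / (2 * n1 * c).
c = 3e8    # speed of light, m/s
n1 = 1.5   # core refractive index (assumed)
NA = 0.3   # numerical aperture (assumed; imaging fibers are often higher)

spread_per_m = NA**2 / (2 * n1 * c)    # seconds of spread per meter

bit_period = 1 / 10e9                  # 10 Gb/s -> 100 ps per bit
max_len_m = bit_period / spread_per_m  # distance until spread fills a bit
print(f"spread: {spread_per_m*1e12:.0f} ps/m, reach: ~{max_len_m:.1f} m")
```

Under these assumptions the pulse spread fills an entire 10 Gb/s bit period after about a meter, which is consistent with the "only a few meters" claim.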
        
       | m3kw9 wrote:
        | If each cable is 10 Gb/s and uses 1 pixel to convert into
        | electrical signals, would that mean they need a 10-gigaframe-
        | per-second sensor?
        
         | cpldcpu wrote:
         | I think that's just a simplifying example. They would most
         | likely not use an image sensor, but a photodetector with a
         | broadband amplifier.
        
         | lo0dot0 wrote:
          | No, not necessarily. If you can distinguish different amplitude
          | levels you can do better. For example, four-level amplitude
          | modulation (PAM4) carries two bits per symbol. There is also
          | the option to use coherent optics, which can detect phase and
          | carry additional information in the phase.
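The bits-per-symbol trade-off above is just log2 of the level count; a quick sketch of what that buys at a fixed 10 Gb/s line rate:

```python
import math

# Bits per symbol for multi-level amplitude modulation: log2(levels).
# More levels -> lower symbol (baud) rate for the same bit rate.
for levels in (2, 4, 8, 16):       # PAM2 / PAM4 / PAM8 / PAM16
    bits = math.log2(levels)
    symbol_rate = 10e9 / bits      # symbols/s needed for 10 Gb/s
    print(f"PAM{levels}: {bits:.0f} bits/symbol -> "
          f"{symbol_rate/1e9:g} GBd for 10 Gb/s")
```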
        
       | amelius wrote:
       | > The transmitter acts like a miniature display screen and the
       | detector like a camera.
       | 
       | So if I'm streaming a movie, it could be that the video is
       | actually literally visible inside the datacenter?
        
         | tehjoker wrote:
          | maybe a low-res, bit-packed, uncompressed binary motion picture
        
         | tails4e wrote:
         | No, this is just an a analogy. The reality is the data is
         | heavily modulated, and also the video is encoded so at no point
         | would something that visually looks like an image be visible in
         | the fibre.
        
         | lo0dot0 wrote:
          | Obviously this is not how video compression and packets work,
          | but for the sake of argument consider the following. The
          | article speaks of a 300-fiber cable. A one-bit-per-pixel square
          | image with approx. 300 pixels is 17x17 in size. Not your
          | typical video resolution.
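The 17x17 figure is just the nearest square frame that fits in 300 one-bit pixels:

```python
import math

# 300 fibers at one bit per pixel: the largest square "frame" that fits.
side = math.isqrt(300)    # integer floor of sqrt(300)
print(side, side * side)  # -> 17 289
```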
        
           | fecal_henge wrote:
           | Not your typical frame rate either.
        
       | speedbird wrote:
        | Headline seems misleading. They're building detectors for
        | someone, not 'betting' on it.
        
       | ls612 wrote:
       | Forgive the noob question but what stops us from making optical
       | transistors?
        
         | qwezxcrty wrote:
          | I think the most fundamental reason is that there is no
          | efficient enough nonlinearity at optical frequencies. Two
          | beams (or frequencies, in some implementations) tend not to
          | affect each other in common materials unless you have a very
          | strong source (>1 W), so the current demonstrations of all-
          | optical switching mostly use pulsed sources.
        
           | ls612 wrote:
           | I wonder if considerably more engineering and research effort
           | will be applied here when we reach the limit of what silicon
           | and electrons can do.
        
             | cycomanic wrote:
              | No, this is not an engineering issue; it's a problem of
              | fundamental physics. Photons don't interact easily. That
              | doesn't mean there are no specialised applications where
              | optical processing can make sense. E.g. a matrix
              | multiplication is really just a more complex lens, so it's
              | become very popular to make ML accelerators based on this.
        
         | elorant wrote:
         | Photons are way more difficult to control than electrons. They
         | don't interact with each other.
        
       | qwezxcrty wrote:
        | Not an expert in communications. Would the SerDes be the new
        | bottleneck in this approach? I imagine there is a reason for
        | serial interfaces dominating over parallel ones, maybe timing
        | skew between lanes. How can this be addressed in this massively
        | parallel optical interface?
        
         | to11mtm wrote:
         | > timing skew between lanes
         | 
         | That's a big part of it. I remember in the Early Pentium 4
         | days, starting to see a lot more visible 'squiggles' on PCB
         | traces on motherboards; the squiggles essentially being a case
         | of 'these lines need more length to be about as long as the
         | other lines and not skew timing'
         | 
          | In the case of what the article is describing, I'm imagining a
          | sort of 'harness cable' with a connector on each end for all
          | the fibers. If the fibers in the cable itself are all the same
          | length, there wouldn't be a timing skew issue. (Instead, you
          | worry about bend radius limitations.)
         | 
         | > Would the SerDes be the new bottleneck in the approach
         | 
         | I'd think yes, but at the same time in my head I can't really
         | decide whether it's a harder problem than normal mux/demux.
        
         | fecal_henge wrote:
          | SerDes is already frequently parallelised. The difference is
          | you never expect the edges, or even the entire bits, to arrive
          | at the same time. You design your systems to recover timing per
          | link so the skew doesn't become the constraint on the line
          | rate.
        
           | bgnn wrote:
            | One can implement SerDes at any point of the electro-optical
            | boundary. For example, if we have 1 Tbps incoming NRZ data
            | from the fiber, and the CMOS technology at hand only allows
            | 10 GHz clock speed for the slicers, one can have 100x
            | receivers (photodiode, TIA, slicer), or 1x photodiode with
            | 100x TIA + slicer, or 1x photodiode + TIA and 100x slicers.
            | The most common is the last one, and it spits out 100x
            | parallel data.
           | 
            | Things get interesting if the losses are high and there needs
            | to be a DFE. This limits speed a lot, but copper solutions
            | moved to sending multi-bit symbols (PAM3, 4, 5, 6, 8, 16...),
            | which can also be done in the optical domain. One can even
            | send multiple wavelengths in the optical domain, so there are
            | ways to boost the bit rate without requiring high clock
            | frequencies.
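The 1:100 split bgnn describes (one fast serial stream sliced into 100 slower parallel lanes) can be sketched as a toy bit-grouping function; the ratio and data are illustrative, and no clocking or timing is modeled:

```python
# Toy 1:100 deserializer: a fast serial bitstream is grouped into
# parallel words, each of which a 100x-slower clock domain can handle.
DESER_RATIO = 100

def deserialize(bits, ratio=DESER_RATIO):
    """Group a serial bitstream into parallel words of `ratio` bits."""
    return [bits[i:i + ratio] for i in range(0, len(bits), ratio)]

serial = [1, 0] * 250             # 500 bits of incoming NRZ data
lanes = deserialize(serial)
print(len(lanes), len(lanes[0]))  # -> 5 100
```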
        
         | waterheater wrote:
         | >serial interfaces dominating over the parallel ones
         | 
         | Semi-accurate. For example, PCIe remains dominant in computing.
         | PCIe is technically a serial protocol, as new versions of PCIe
         | (7.0 is releasing soon) increase the serial transmission rate.
         | However, PCIe is also parallel-wise scalable based on
         | performance needs through "lanes", where one lane is a total of
         | four wires, arranged as two differential pairs, with one pair
         | for receiving (RX) and one for transmitting (TX).
         | 
         | PCIe scales up to 16 lanes, so a PCIe x16 interface will have
         | 64 wires forming 32 differential pairs. When routing PCIe
         | traces, the length of all differential pairs must be within
         | <100 mils of each other (I believe; it's been about 10 years
         | since I last read the spec). That's to address the "timing skew
         | between lanes" you mention, and DRCs in the PCB design software
         | will ensure the trace length skew requirement is respected.
         | 
         | >how can this be addressed in this massive parallel optical
         | parallel interface?
         | 
         | From a hardware perspective, reserve a few "pixels" of the
         | story's MicroLED transmitter array for link control, not for
         | data transfer. Examples might be a clock or a data frame
         | synchronization signal. From the software side, design a
         | communication protocol which negotiates a stable connection
         | between the endpoints and incorporates checksums.
         | 
         | Abstractly, the serial vs. parallel dynamic shifts as
         | technology advances. Raising clock rates to shove more data
         | down the line faster (serial improvement) works to a point, but
         | you'll eventually hit the limits of your current technology.
         | Still need more bandwidth? Just add more lines to meet your
         | needs (parallel improvement). Eventually the technology
         | improves, and the dynamic continues. A perfect example of that
         | is PCIe.
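The length-matching rule mentioned above can be sketched as a toy DRC-style check; the trace lengths and the 100-mil limit here are illustrative, not taken from the PCIe spec:

```python
# Toy length-matching check in the spirit of a PCB DRC: flag any
# differential pair routed more than MAX_SKEW_MILS longer than the
# shortest pair. All numbers are made up for illustration.
MAX_SKEW_MILS = 100

pair_lengths = {        # routed length of each differential pair, in mils
    "lane0_tx": 5000,
    "lane0_rx": 5040,
    "lane1_tx": 5130,   # too long relative to the shortest pair
}

shortest = min(pair_lengths.values())
for name, length in pair_lengths.items():
    status = "OK" if length - shortest <= MAX_SKEW_MILS else "VIOLATION"
    print(f"{name}: +{length - shortest} mils over shortest -> {status}")
```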
        
         | cycomanic wrote:
          | They are doing 10 Gb/s over each fibre; to get to 10 Gb/s you
          | have already undergone a parallel -> serial conversion in the
          | electronics (clock rates of your ASICs/FPGAs are much lower),
          | so increasing the serial rate is in fact the bottleneck. Where
          | the actual optimum serial rate lies depends highly on the cost
          | of each transceiver; e.g. long-haul optical links operate at up
          | to 1 Tb/s serial rates, while datacenter interconnects are
          | 10-25G serial AFAIK.
        
       | cycomanic wrote:
       | That article is really low on details and mixes up a lot of
       | things. It compares microleds to traditional WDM fiber
       | transmission systems with edge emitting DFB lasers and ECLs, but
       | in datacentre interconnects there's plenty of optical links
       | already and they use VCSELs (vertical cavity surface emitting
       | lasers), which are much cheaper to manufacture. People also have
       | been putting these into arrays and coupling to multi-core fiber.
       | The difficulty here is almost always packaging, i.e. coupling the
       | laser. I'm not sure why microleds would be better.
       | 
        | Also, transmitting 10 Gb/s with an LED seems challenging. The
        | spectral bandwidth of an incoherent LED is large, so are they
        | doing significant DSP (which costs money and energy and
        | introduces latency), or are they restricting themselves to very
        | short (10s of m) links?
        
         | tehjoker wrote:
            | Short links; it's in the article.
        
           | cycomanic wrote:
           | Ah I missed the 10m reference there. I'm not sure it makes
           | more sense though. Typical intra-datacenter connections are
           | 10s-100s of meters and use VCSELs, so introducing microleds
           | just for the very short links instead of just parallelising
           | the VCSEL connections (which is being done already)? If they
           | could actually replace the VCSEL I would sort of see the
           | point.
        
         | qwezxcrty wrote:
          | I guess they are doing directly modulated IM-DD for each link,
          | so the DSP burden is not related to the coherence of the
          | diodes? Also, indeed, very short reach in the article.
        
           | cycomanic wrote:
            | The problem with both LEDs and imaging fibres is that modal
            | dispersion is massive and completely destroys your signal
            | after only a few meters of propagation. So unless you do MMSE
            | equalisation (which I assume would be cost prohibitive), you
            | really can only go a few meters. IM-DD doesn't really make a
            | difference here.
        
       | albertzeyer wrote:
        | There is also optical neuromorphic computing, as an alternative
        | to electronic neuromorphic computing like memristors. It's a
        | fascinating field, where you use optical signals to perform
        | analog computing. For example:
       | 
       | https://www.nature.com/articles/s41566-020-00754-y
       | 
       | https://www.nature.com/articles/s44172-022-00024-5
       | 
        | As far as I understood, you can only compute quite small neural
        | networks before the noise gets too large, and also only a very
        | limited set of computations works well in photonics.
        
         | cycomanic wrote:
          | The issue with optical neuromorphic computing is that the field
          | has been doing the easy part, i.e. the matrix multiplication.
          | We have known for decades that imaging/interference networks
          | can do matrix operations in a massively parallel fashion. The
          | problem is the nonlinear activation function between your
          | layers. People have largely been ignoring this, or just
          | converting back to electrical (where you are limited again by
          | the cost/bandwidth of the electronics).
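The split cycomanic describes (the linear layer is cheap in optics, the nonlinearity goes back through electronics) can be sketched in a few lines of NumPy; shapes and values are arbitrary:

```python
import numpy as np

# One neural-network layer, split the way the comment describes:
# the matrix multiply is the part optics does well (a lens/interference
# mesh), while the nonlinear activation is typically done electronically.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))        # "optical" weight matrix
x = rng.normal(size=8)             # input light amplitudes

y_linear = W @ x                   # massively parallel in optics
y_out = np.maximum(y_linear, 0.0)  # ReLU: the step optics struggles with

print(y_out.shape)                 # -> (4,)
```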
        
           | seventytwo wrote:
           | Seems hard to imagine there's not some non-linear optical
           | property they could take advantage of
        
             | cycomanic wrote:
              | The problem is intensity/power. As discussed previously,
              | photon-photon interactions are weak, so you need very high
              | intensities to get a reasonable nonlinear response. The
              | issue is that optical matrix operations work by spreading
              | out the light over many parallel paths, i.e. reducing the
              | intensity in each path. There might be some clever ways to
              | overcome this, but so far everyone has avoided that
              | problem. Papers say they did "optical deep learning" when
              | what they really did was an optical matrix multiplication,
              | but saying that would not have resulted in a Nature
              | publication.
        
       ___________________________________________________________________
       (page generated 2025-05-26 23:00 UTC)