[HN Gopher] Displayport: A Better Video Interface
       ___________________________________________________________________
        
       Displayport: A Better Video Interface
        
       Author : zdw
       Score  : 447 points
       Date   : 2023-07-11 14:50 UTC (8 hours ago)
        
 (HTM) web link (hackaday.com)
 (TXT) w3m dump (hackaday.com)
        
       | tshaddox wrote:
       | > However, I'd like to tell you that you probably should pay more
       | attention to DisplayPort - it's an interface powerful in a way
       | that we haven't seen before.
       | 
       | Lines like that really show just how long it can take for
       | standards to get on the radar of mainstream tech culture. I
       | remember hearing about and being excited about DisplayPort's move
       | to packetized digital video in college in 2008, and seeing the
        | first Macs with Mini DisplayPort later that year (or perhaps it
       | was in 2009)!
       | 
       | I was actually under the impression that it has been well-known
       | and commonplace for hobbyist and enthusiast PCs for well over 10
       | years, but I'm probably wrong about that!
        
         | NavinF wrote:
         | > impression that it has been well-known and commonplace for
         | hobbyist and enthusiast PCs for well over 10 years
         | 
         | It is. For a long time DP was the only standard that could do
         | variable refresh rate. Even today all high end monitors have DP
         | while the cheapest monitors only have HDMI.
        
           | fireflash38 wrote:
           | Which is ironic considering the cost of the HDMI port is
           | likely higher than the DP one due to licensing!
        
         | hot_gril wrote:
         | That, and a similar line might've been said about FireWire,
         | which didn't really make it.
        
           | dylan604 wrote:
           | Maybe I agree with that, but I also know that firewire helped
           | usher in the digital video era. It allowed the transition
            | from tape-based acquisition when media cards were
           | prohibitively expensive. Audio/Video/Deck control all down
           | one single cable straight from the camera to the computer was
           | what really kicked the prosumer market into being able to
           | lean closer to pro than consumer. Now that media cards are
           | actually affordable, that does seem like ancient history. I
           | could see how you might think of firewire as a failure if
           | you're looking at it as a USB type transition, but for the
           | camera/video professions, it served a very good purpose even
            | if short-lived.
        
             | pdmccormick wrote:
             | I once purchased a FireWire multi-channel audio interface
             | with the express intent of reverse engineering the protocol
             | in order to add Linux support. I purchased a specialized
             | PCI card that was intentionally non-standard (not OHCI
             | 1394) but could snoop and capture traffic between a device
             | and a host. I chipped away at deciphering a register map
             | for awhile, but eventually better USB audio devices came
             | along that had more driver support.
             | 
             | Still trying to do the same sort of thing though with some
             | audio related USB and Ethernet/IP devices so I guess I
             | never really gave up the idea in my heart!
        
               | dylan604 wrote:
               | you are a glutton for punishment. a hacker after my own
               | heart!
               | 
               | i have so many unfinished things where i just knew
               | something could be figured out, only to not succeed. but
               | it sure is fun trying on top of learning new things as
               | well.
        
               | CameronNemo wrote:
               | Do the USB devices have comparable round trip latency?
               | That is where USB really hurts and where the Thunderbolt
                | audio interfaces seem to find their niche today.
               | 
               | I'm looking forward to audio interfaces that can do USB4
               | wrapped PCIe, but for now I live with the latency on
               | Linux.
        
               | Blackthorn wrote:
               | RTL latency doesn't seem that bad on chips that are built
               | for the purpose, like RME's. Just the off the shelf stuff
               | might be less than ideal.
               | 
               | If anything I think a problem is there's just overall too
               | much slop in the whole chain, including the operating
               | system, motherboard, whatever else. It's a little
               | ridiculous that a $50 guitar pedal can easily get better
               | RTL numbers than a $1500 computer system.
        
               | CameronNemo wrote:
               | Linux has preempt_rt pretty conveniently accessible, and
               | Pipewire has some useful tuning knobs, but my Steinberg
               | UR44 can't seem to keep up.
        
             | hot_gril wrote:
             | Yeah, FireWire was a necessity at the time for certain use
             | cases. Even basic consumer digital camcorders required
             | FW400 to pull onto a PC.
        
               | dylan604 wrote:
                | I may be mis-remembering, but was it possible to add
               | FireWire to an existing PC with an expansion card? I know
               | Thunderbolt was not possible from 3rd party vendors and
               | only the mobo manufacturers could offer a card. I bought
               | one to make a Hackintosh, but then a mobo firmware
               | disabled the card because they didn't want to support it.
                | I seem to recall FireWire being the same way.
        
             | ShadowBanThis01 wrote:
             | It wasn't that short-lived. All DV cameras had FireWire.
             | Although Sony, per its mania for undermining industry
              | standards, created a bastardized version (called i.LINK,
             | ugh) whose connector lacked power and required a physical
             | adapter.
             | 
             | External FireWire drives (like LaCie's) were popular for
             | quite a long time, since they required no extra power
             | source.
        
               | throwawaymobule wrote:
               | I think the 4 pin version was also standard, it's just
               | that Sony didn't own the trademark on FireWire, Apple
               | did. Sony was actually one of the companies that
               | developed the spec, according to Wikipedia.
               | 
               | I had a HP laptop with that connector, but it was
               | labelled IEEE-<numbers>. Bought a 4 to 6 cable to connect
               | it to an old iMac, but never bothered enough to use it.
        
               | MBCook wrote:
               | It was a standard because Sony said "we're doing this"
               | and gave the FireWire group an ultimatum: standardize it
                | or we're not on board.
        
               | dylan604 wrote:
               | Sony also gave us a professional studio DVCam deck that
                | was SDI based that allowed you to connect two of them to
               | get 4x dubbing speeds. Not once did it ever get used at
                | the studio where I was working. Nothing else in the shop used
               | that format, so it was always just realtime SDI work.
               | ahh, sony
        
         | MBCook wrote:
         | Same. I live in the Mac world. When I took a job where I had to
         | use PCs at work I was surprised to see they were mostly using
         | HDMI with some people using older DVI equipment.
         | 
          | I had just assumed everyone had transitioned like Macs had
          | years before.
         | 
         | Nope. Lots of people use/want HDMI to this day.
        
       | tivert wrote:
       | Are there any KVM switches that do Displayport _well_ (i.e. where
       | switching between inputs does not look like a display disconnect
       | to the PC)?
       | 
       | I'm still using HDMI because I like to share my home multi-
       | monitor setup between my personal machine and my work laptop, and
       | the KVM switches are able to fool the PCs into thinking the
        | monitors are always connected. Years ago I tried a DisplayPort
        | switch, but it could not -- I assume because of the greater
        | sophistication of the DisplayPort protocol.
        
         | mvid wrote:
         | https://store.level1techs.com/?category=Hardware
         | 
          | This is what I use. It appears to disconnect, but also doesn't
         | seem to be an issue. My machines re-organize instantly.
        
           | zerkten wrote:
           | I was looking at this as an upgrade pick and don't have any
           | re-arrangement with my TESmart (TES-HDK0402A1U-USBK). What
           | monitor(s) do you have?
        
           | formerly_proven wrote:
           | It's gotten better in the last year or so with Windows 10 but
           | it'll still sometimes just fall apart when the display
           | configuration changes, which is something that just never
           | happened for any reason with HDMI/DVI.
        
           | jamestanderson wrote:
           | I got their 10gbps displayport switch to use with switching a
           | single monitor between a Windows desktop PC and an M1 MacBook
           | Pro. I have a 4k@144hz monitor and can get the full framerate
           | and resolution with this setup. I've never had any problems,
           | would highly recommend.
        
             | c-hendricks wrote:
             | Nice, I'll check these out. I went with an HDMI KVM and am
             | worried about updates to HDMI making it obsolete.
        
         | bbatha wrote:
          | I had the StarTech one the siblings have mentioned, but it
          | wasn't very good and didn't do EDID emulation correctly. This
         | CKL one [0] has been working really well, and supports USB 3
         | which is a nice bonus so I can share my webcam. Though
         | sometimes after wake up my macbook forgets about my second
         | monitor (I have an M1 connected to a cable matters thunderbolt
         | dock), my windows machine which has direct DP connections
         | doesn't have the same issue.
         | 
         | 0: https://www.amazon.com/gp/product/B09STVW821/
        
         | stephenr wrote:
         | The new Dell 6K has kvm (and PiP) functionality across its
         | inputs, and it does appear from my modest use of this feature
         | so far, that it works as you would want (ie it still thinks the
         | display is connected, even when not showing that input)
        
         | xxpor wrote:
         | The magic words you're looking for are "EDID emulation". The
         | KVM will continue to send the EDID data from the monitor even
         | after you've switched away, which will fix that issue.
         | 
         | It's relatively uncommon and not always implemented super well,
         | but it's a requirement for any DP KVM to be not super annoying
         | IMO.
         | 
         | There was one particular KVM brand that was supposed to do it
         | well whose name is escaping me now :/. I was looking at buying
         | one in ~ May 2020 for obvious reasons, but they were on super-
         | backorder (also for obvious reasons), so I never got around to
         | it. IIRC they were about $500 for a 4 input/2 output version,
         | so not cheap.
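The EDID emulation described above is mechanical enough to sketch: a 128-byte EDID block has a fixed 8-byte header and a checksum, and an emulating KVM simply keeps replaying a valid block to the host after you switch away. A minimal Python sketch of the block structure (the block contents below, including the "DEL" manufacturer ID, are illustrative, not from any real monitor):

```python
# Hypothetical sketch of a 128-byte EDID block, the data an "EDID
# emulation" KVM keeps presenting to the host after switching away.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    """Check the fixed 8-byte header and the block checksum.

    Every 128-byte EDID block must sum to 0 modulo 256; an emulator
    that caches or fabricates EDID has to preserve this invariant.
    """
    if len(block) != 128 or block[:8] != EDID_HEADER:
        return False
    return sum(block) % 256 == 0

def manufacturer_id(block: bytes) -> str:
    """Decode the 3-letter PNP manufacturer ID packed into bytes 8-9."""
    word = (block[8] << 8) | block[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))
```

A "dumb" emulator only needs to capture one valid block at power-on and replay it forever, which is exactly why cheap implementations can still confuse hosts when the real monitor's capabilities change.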
        
           | hazebooth wrote:
           | Were you thinking of Level1techs?
           | 
           | https://store.level1techs.com/products/14-kvm-switch-
           | triple-...
        
             | xxpor wrote:
             | I'm 90% sure it was, but it looks like they did a major UI
             | update to the site so it's not triggering the "that was
             | it!" lightbulb in my brain.
        
               | xboxnolifes wrote:
               | Alternatively, you may have been thinking about
               | ConnectPro. I ordered a kvm from them around the same
               | timeframe and it was delayed quite a bit from backorder.
               | (Though, they also did a major UI change, so might not be
               | able to tell either).
        
               | Our_Benefactors wrote:
               | Two thumbs down for connect pro. I ordered their top of
               | the line 4 computer, 2 monitor DisplayPort KVM and it
               | took months to arrive. I could not cycle between inputs
               | using the buttons. They were more like a suggestion to
               | use that signal path; I would constantly need to power
               | cycle the kvm, monitors, or both.
               | 
               | I ended up ditching it on eBay at a significant loss for
               | a $30 usb switch and just switch monitor inputs manually.
               | Far cheaper solution and way less fussy.
        
               | AlotOfReading wrote:
               | I use one, so I can confirm they work extremely well,
               | subject to some caveats:
               | 
               | * The _total_ cable length is important, both between the
               | host  / KVM and the KVM / monitor, as well as any daisy
               | chained displays you have. I had to use certified cables
               | to get everything working reliably with my setup.
               | 
               | * There's a weird interaction with BIOS power on. The
               | boot display drivers I have freak out if they aren't the
               | active display and fail. I solve this by switching the
               | KVM _before_ I turn the computer on. After everything is
               | booted into an OS, it works fine to switch.
               | 
               | * Power supply quality is important. I had some issues
               | before I made sure the power supply was reliable.
               | 
               | KVM switches are just inherently difficult little
               | devices. I haven't had issues since I got it working
               | though.
        
               | drbawb wrote:
               | >There's a weird interaction with BIOS power on. The boot
               | display drivers I have freak out if they aren't the
               | active display and fail. I solve this by switching the
               | KVM before I turn the computer on. After everything is
               | booted into an OS, it works fine to switch.
               | 
               | Do you have an AMD GPU by any chance? I have the
               | level1tech 2-head DP 1.4 KVM, with an AMD RX 560 on a
               | Linux host, and after updating to kernel 6.4 recently my
               | computer now boots fine without a monitor attached.
               | 
               | I had a similar issue where a display had to be _on_ and
               | _connected_ (i.e: active on the KVM) at boot time, or the
               | GPU wouldn't work at all. I could get in via SSH, so I
               | tried various amdgpu recovery options, poking the device
               | to reset it, reloading the kernel modules, etc., and
               | never had any luck. I just lived with the quirk. It was
               | problematic because if you left home with the KVM
               | selected on the Windows guest, and needed to reboot the
               | Linux host remotely, you'd come home to a non-functional
               | Linux desktop.
        
               | xxpor wrote:
               | I have a similar issue with a Nvidia 1080Ti on my old
               | desktop. It's related to the UEFI deciding if the iGPU or
               | the Nvidia GPU should be primary and to disable the iGPU
               | or if the iGPU should stay enabled.
        
               | hiatus wrote:
               | Curious how you went about determining that was the
               | source of the issue.
        
               | AlotOfReading wrote:
               | Nvidia on both, one consumer and one workstation. Neither
               | CPU has built-in graphics iirc, nor do the motherboards
               | expose the ports.
        
           | Fnoord wrote:
           | BliKVM PCIe I bought (based on PiKVM) came with an EDID
           | emulator.
        
         | seanalltogether wrote:
         | Can you explain why this is beneficial? I have a mac laptop and
         | pc desktop at home that i switch between depending on whatever
         | I need to do. By triggering a disconnect, it means all my mac
         | windows that are on the main monitor will zip back over to the
         | laptop so they're still reachable if i need to access them with
         | the trackpad and integrated keyboard. When I switch the kvm
         | back to mac all those windows jump back to the main monitor.
        
           | pavon wrote:
           | Flaky drivers. KVM induced unresponsiveness is pretty much
           | the only reason I ever have to hardboot my computers.
           | 
           | Also, even if the drivers are solid, they take longer to
           | renegotiate with a monitor that was removed and plugged back
           | in compared to one they think was always there, which matters
           | if you switch back and forth frequently.
           | 
            | Lastly, sometimes the OS doesn't put things back the way
           | they were when you plug a monitor back in. If you have a
           | laptop which has a lower resolution display than the external
           | monitor, you'll often return to find all the windows shrunk
           | to fit the laptop display. Not an issue if you run everything
           | full-screen, but annoying if you tile windows.
        
         | jchw wrote:
         | None of them are perfect, but I've heard good things about the
         | DP switch from Level1techs. The thing is, all of them are a
         | little tricky, but they mostly differ in how quirky they are,
         | and I suspect the reason why people like the Level1techs DP
         | switch is that they seem to at least try to alleviate some of
         | the issues DP switches tend to get into.
         | 
         | https://store.level1techs.com/products/14-kvm-switch-dual-mo...
         | 
         | The startech one I have is alright... But Apple computers
         | absolutely hate it and frequently refuse to display, and
         | sometimes Windows gets stuck and USB devices stop working.
         | Strangely enough... Linux doesn't ever have any problems with
         | either display or input. A rare win, but fine by me.
        
           | pxc wrote:
           | The Level1Techs KVM switches are rebranded Startech switches
           | with 'dumber' firmwares whose dumbness affords better
           | compatibility with niche DisplayPort features.
           | 
           | I have a bunch of them and I like them pretty well, but
           | getting a bunch of computers all plugged in turns out to be a
           | bit of a nightmare, especially when you need some long-ish
           | cable runs or you are daisy-chaining devices (e.g., multiple
           | KVM switches, adding USB hubs or Thunderbolt docks, etc.).
           | 
           | The Level1Techs KVM switches don't meet GP's criterion for
           | hotplugging behavior, unfortunately. Switching between
           | devices is just an unplug and replug for them.
           | 
           | Like you, I've found that macOS and Windows don't handle
           | hotplugging monitors well, but Linux desktops (in my case,
           | KDE Plasma) consistently do the right thing and don't require
           | a repeater to lie to them about monitors always being plugged
           | in.
           | 
           | FWIW, I don't get the 'Apple computers just refuse to work'
           | issue with any of my L1T KVMs.
        
             | ender341341 wrote:
             | > FWIW, I don't get the 'Apple computers just refuse to
             | work' issue with any of my L1T KVMs.
             | 
              | My work Intel MacBook worked great on it for like a year
              | once I got a high quality USB-C -> DisplayPort cable, but
              | an OS update borked it... though it's definitely the Mac
              | that's the problem, as it also has problems sometimes going
              | straight to the monitor too (on the other hand, a 49"
              | ultrawide is pushing the bandwidth near its limits).
             | 
              | My personal ARM MacBook has always worked great on it with
              | even a crappy USB-C -> DisplayPort cable.
              | 
              | My Windows desktop also always worked great on it.
        
               | pxc wrote:
               | > on the other hand 49" ultrawide is pushing the
                | bandwidth near its limits
               | 
               | One of the things I've learned too late is that using
               | DisplayPort (and maybe HDMI, idk) anywhere near its
               | bandwidth limits is not worth it for me. Having to think
               | about cable run lengths as well as cable quality and
               | peripheral quirks and internal Thunderbolt dock bandwidth
               | allocation and so on and so on just fucking sucks.
               | 
               | It'll probably take until the next/latest (2.1)
               | generation of DisplayPort propagates before using
               | multiple monitors with specs similar to my current ones
               | (high refresh rate and HDR, but not even HiDPI) isn't
               | painful, cumbersome, and finicky.
               | 
               | I probably won't be able to use them by then anyway. Ugh.
        
             | baby_souffle wrote:
             | > The Level1Techs KVM switches are rebranded Startech
             | switches with 'dumber' firmwares whose dumbness affords
             | better compatibility with niche DisplayPort features.
             | 
             | Rextron [1] is the actual ODM. They don't do any direct to
             | consumer sales, though. That's why L1 / Startech / other
             | "brands" sell them on amazon and the like.
             | 
             | Last I spoke with the L1 guy, they were still having some
             | issues with the high speed USB-C switching chips on the 1x4
             | "all USB-C" model that he's got a wait-list for.
             | 
             | [1] https://www.rextron.com/KVM-Switches.html
        
               | pxc wrote:
               | Ah! Great info. Thanks :)
        
           | Steltek wrote:
           | I had much more serious problems with a StarTech DP KVM and
           | Macs. My Macbook would hang and crash-reboot. Both on the
           | initial plug-in and on switching inputs.
           | 
           | Everything else seemed to handle it fine with Linux being
            | especially unfazed, as usual.
        
       | [deleted]
        
       | shmerl wrote:
       | _> Plus, It's Not HDMI_
       | 
       | This. HDMI and its cartel that profits from its patents are super
       | annoying.
        
       | cat_plus_plus wrote:
       | Maybe it is, but I also want audio, so the point is moot.
        
         | goombacloud wrote:
          | Audio is covered as well, as noted in the article; I
          | recommend reading it.
        
       | TazeTSchnitzel wrote:
       | The amazing thing about DVI, and by extension HDMI, is that it's
       | just VGA but digital, with all the timing and synchronisation
       | complexity that implies. Recall that DVI can have both an analog
       | (VGA/DVI-A) and digital (DVI-D) signal in the same cable. They
       | aren't independent; they share some pins and have the same
       | timing. You could have made a CRT monitor that used DVI-D just by
       | adding a DAC, though I'm not sure if anyone ever did.
       | 
       | DisplayPort does away with all that legacy. I assume the hardware
       | to implement it is much simpler and more reliable.
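The CRT-era timing baggage mentioned above is easy to quantify: a TMDS link clocks out the blanking intervals too, so the pixel clock is set by the "total" timings rather than the visible picture. A rough sketch using the standard CEA-861 totals for 1080p60:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """TMDS-style pixel clock in MHz: blanking intervals are clocked
    out too, so the link rate is set by total timings, not by the
    number of visible pixels."""
    return h_total * v_total * refresh_hz / 1e6

visible_only = pixel_clock_mhz(1920, 1080, 60)    # 124.416 MHz of picture
with_blanking = pixel_clock_mhz(2200, 1125, 60)   # 148.5 MHz actual link
```

Nearly a fifth of the 1080p60 link is spent transmitting intervals that only existed so a CRT's electron beam could fly back.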
        
         | rasz wrote:
         | >DisplayPort does away with all that legacy.
         | 
          | You wish. DP sends exactly the same bytes DVI does (blanking
          | and all), just broken up into packets.
        
           | fanf2 wrote:
           | The important difference is that (like VGA) DVI and HDMI
           | still dedicate particular physical wires to separate red,
            | green, and blue channels. DisplayPort does not shard
           | individual pixels across multiple channels: all bits (all
           | colours) of a pixel are sent on the same diffpair. If a DP
           | link has multiple lanes then the bits of different successive
           | pixels are sent on each lane.
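The layout difference described above can be sketched in a few lines. This is a toy model of the framing only (real DP symbol encoding is more involved), but it captures per-color wires versus whole pixels interleaved across lanes:

```python
# Toy model: DVI/HDMI dedicates a physical channel per color component,
# while DisplayPort sends whole pixels and distributes successive
# pixels across however many lanes the link trained at.

def dvi_style(pixels):
    """One 'wire' per color component: three parallel streams."""
    return {
        "red":   [p[0] for p in pixels],
        "green": [p[1] for p in pixels],
        "blue":  [p[2] for p in pixels],
    }

def dp_style(pixels, lanes=4):
    """Whole pixels interleaved across lanes: pixel i on lane i % lanes."""
    streams = [[] for _ in range(lanes)]
    for i, p in enumerate(pixels):
        streams[i % lanes].append(p)
    return streams
```

The DP arrangement is what lets a link drop to 1 or 2 lanes (or carry non-video data) without redefining what each wire means.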
        
             | rasz wrote:
             | This is great, but then they drop the ball by forcing
             | garbage ("interspersed stuffing symbols") between the
             | packets instead of letting you use that BW.
        
         | madars wrote:
         | This also has side-channel implications, and is a reason why
         | eDP is recommended in voting applications.
         | https://www.eerstekamer.nl/nonav/overig/20160428/richtlijnen...
         | 
         | >DisplayPort uses a scrambler as part of its line encoding in
         | order to flatten the Fourier spectrum of its emissions and
         | suppress spectral peaks caused by particular image contents.
         | This reduces the chances of any particular image content
         | causing a problem spectral peak during SDIP-27 and EMI spectrum
         | measurements. According to the standard, the scrambler reduces
         | spectral peaks by about 7 dB. As a side effect, the scrambler
         | also makes it far more difficult, probably even impractical,
         | for an attacker to reconstruct any information about the
         | displayed image from the DisplayPort emissions. [..]
         | DisplayPort uses a small number of fixed bit rates, independent
         | of the video mode used. Unlike with most other digital
         | interfaces, video data is transmitted in data packets with
         | header and padding bytes, and not continuously with a
         | television-like timing. As a result, DisplayPort cables are not
          | a common source of van Eck-style video emanations and this again
         | will make it very hard for an eavesdropper to synchronize to
         | the transmitted data.
         | 
         | Speaking of which: how does one force HDCP (say, in Linux)?
         | That would transform this technology from a DRM nuisance
          | (strippers easily available on AliExpress) into a van Eck
          | countermeasure.
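The scrambler's effect quoted above is easy to demonstrate with a toy additive scrambler. The polynomial used here (x^16 + x^5 + x^4 + x^3 + 1, seeded with 0xFFFF) is the one commonly cited for DisplayPort 1.x line scrambling, but treat this as an illustration of the principle rather than the exact on-wire encoding:

```python
# Toy additive scrambler illustrating the quoted property: a constant
# input (e.g. a flat image region) becomes a noise-like byte stream,
# flattening the emission spectrum, and the receiver undoes it by
# XORing with the same keystream.

def lfsr_bytes(n, state=0xFFFF, taps=(16, 5, 4, 3)):
    """Generate n keystream bytes from a 16-bit Fibonacci LFSR."""
    out = []
    for _ in range(n):
        byte = 0
        for _ in range(8):
            fb = 0
            for t in taps:  # feedback bit = XOR of the tapped bits
                fb ^= (state >> (t - 1)) & 1
            byte = (byte << 1) | (state & 1)   # emit the LSB
            state = (state >> 1) | (fb << 15)  # shift in the feedback
        out.append(byte)
    return bytes(out)

def scramble(data: bytes) -> bytes:
    """XOR with the keystream; applying it twice restores the input."""
    return bytes(d ^ k for d, k in zip(data, lfsr_bytes(len(data))))
```

Because scrambling is just XOR with a pseudorandom stream, an all-zero "image" leaves the scrambler as pure keystream, which is exactly why image content stops dominating the cable's RF signature.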
        
           | angry_octet wrote:
           | HDCP can't really be stripped, if it's on then something is
           | an HDCP receiver. For v1.4 signals you can probably get
           | something with the leaked key, but not for v2.2 as yet.
           | 
           | HDCP itself needs to be licensed in order to generate it, and
           | that licence is only available to HDMI Adopter companies.
           | However there is code to get some chipsets to turn it on:
           | https://github.com/intel/hdcp
           | 
           | Grab it now before Intel deletes it completely.
           | 
           | HDCP is really a very ugly protocol designed just for anti-
           | copying, I wouldn't build anything relying on it, and
           | everything is harder with HDCP from an AV integration
           | perspective. If you have long links that you want to secure,
           | use something like SDVoE with encryption and authentication
           | (bits are easily flipped in HDCP).
        
             | madars wrote:
             | There are devices that will give you HDCP 2.3->1.4
             | conversion for compatibility reasons (or maybe wink-wink
             | reasons), and after that you can use a HDCP 1.4 stripper.
             | There are also devices that will strip 2.2 outright (they
             | are marketed as HDMI splitters but, oops, one of the two
             | outputs has unencrypted signal, an honest mistake); not
             | sure about 2.3 though. Completely agree that HDCP makes
             | everything harder, but in this scenario it would also make
             | attacker's job of descrambling the eavesdropped signal
             | harder. And this is after they have overcome packetization
             | and spread spectrum hurdles, which the paper suggests is
             | very challenging to do, so would be a (hacky) defense-in-
             | depth for DisplayPort. Even more important for HDMI/DVI
             | which lacks the aforementioned hurdles.
        
         | MisterTea wrote:
         | > You could have made a CRT monitor that used DVI-D just by
         | adding a DAC, though I'm not sure if anyone ever did.
         | 
         | From memory IBM offered a crt with DVI.
        
           | thereddaikon wrote:
           | Plenty did but they were usually DVI-A or DVI-I
        
         | pavlov wrote:
         | I was guessing that the oddly beautiful 17" Apple CRT Studio
         | Display [1] from 2000 might have been a CRT that used a digital
         | signal cable because it had the short-lived Apple Display
         | Connector, but apparently ADC carried analog too.
         | 
         | [1]
         | https://everymac.com/monitors/apple/studio_cinema/specs/appl...
        
           | ShadowBanThis01 wrote:
           | The ADC carried power, too, which is why Apple went with it.
        
       | [deleted]
        
       | pid-1 wrote:
       | Sorry if that sounds stupid, but why isn't Ethernet used for this
       | sort of high bw signaling?
        
         | kimburgess wrote:
          | Because (consumer) ethernet is low bandwidth compared to what's
          | needed for video. DisplayPort 2.0 provides an 80 Gbps link;
          | HDMI 2.1 is 48 Gbps. Both already use DSC, a mezzanine
          | compression codec, as this bandwidth is already saturated for
          | immediate needs.
         | 
         | Ethernet is absolutely used for media transport outside of the
         | consumer space. Particularly where there's a need to distribute
         | signals further than what DP or HDMI can provide. The challenge
          | is this always comes with a trade-off. Any compression will
          | result in some mix of image quality loss, increased latency,
          | and cost. What that mix looks like differs across
         | applications.
         | 
         | See https://sdvoe.org/, https://ipmx.io/,
         | https://www.intopix.com/flinq,
         | https://www.aspeedtech.com/pcav_ast1530/ for more in that
         | space.
         | 
         | https://hdbaset.org/ is the other key tech. That uses the same
         | PHY as ethernet, but is a proprietary signal.
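The bandwidth gap above can be sanity-checked with back-of-envelope arithmetic: the uncompressed payload rate is just width x height x bits-per-pixel x refresh, before blanking and line-coding overhead push the real link requirement higher:

```python
def raw_video_gbps(width, height, hz, bpp=30):
    """Uncompressed video payload rate in Gbit/s (bpp=30 = 10-bit RGB)."""
    return width * height * bpp * hz / 1e9

# 4K at 144 Hz with 10-bit color needs ~35.8 Gbit/s of payload alone,
# far beyond the 10 Gbit/s of high-end consumer Ethernet, and overhead
# pushes the actual link requirement higher still.
four_k_144 = raw_video_gbps(3840, 2160, 144)
```

Even plain 1080p60 at 8-bit color is ~3 Gbit/s raw, which is why gigabit Ethernet can only carry video after heavy compression.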
        
           | chrisco255 wrote:
            | The current Ethernet spec supports up to 800 Gbps, but I
           | think 40 Gbps is about as fast as you can find on the market
           | right now. They could build multi Gbps wifi routers too, it's
           | just not common in the consumer market.
        
             | akvadrako wrote:
              | Even 10Gb ethernet isn't suitable for consumers. 40Gb
              | practically requires fiber, which has never been successful
              | in homes since it requires more careful handling.
        
         | fsh wrote:
         | The bandwidth is much too low.
        
           | londons_explore wrote:
           | If this were true, then people would be running datacenter
           | networking over HDMI.
           | 
           | Data is data - there is no benefit to designing two entirely
           | different transport mechanisms for video vs other data.
           | 
           | You could argue that video data has a deadline to meet - but
           | ethernet already has lots of mechanisms for QoS and bandwidth
           | reservations to make sure that someones torrents don't
           | interrupt something latency sensitive. Sure, they aren't
           | widely supported over the internet, but between your PC and
           | your display - they can be engineered to work properly.
        
             | FuriouslyAdrift wrote:
             | HDMI has been used for stacking ports for years... I think
             | it's gone out of fashion, though.
             | https://www.dell.com/support/kbdoc/en-us/000120108/how-to-
             | st...
        
             | wmf wrote:
             | 25 Gigabit Ethernet costs ~100x more than 32 Gigabit
             | Displayport.
        
               | aidenn0 wrote:
               | Sounds like the real question is why don't people use DP
               | for p2p networking links...
        
               | xienze wrote:
               | There's nothing magic about DP cables, they're just
               | typically a short-ish length, which is why they can be
               | high bandwidth and relatively cheap. Once you get to a
               | decent length, the cables get really expensive because
               | you have to put optical transceivers on either side. I
               | have a 25 foot DP cable which cost over $100, for
               | example.
               | 
               | So you could do a short length, high bandwidth Ethernet
               | cable, I'm sure. But the reason we don't is probably
               | because differentiation between connectors is desired ---
               | consumers are dumb, frankly. They'll think any old
               | Ethernet cable will work. Just look at what kind of a
               | mess exists with USBC.
        
               | ianburrell wrote:
               | The commonality of connector is important. Ethernet has
               | stopped using twisted pair cables for higher speeds. The
                | SFP transceiver means you can use a short DAC copper cable or
               | fiber optics for longer distances. The DAC cables that
               | would compete with DisplayPort are pretty cheap.
               | 
               | Also, a big portion of the cost of networking is in the
               | switch that can handle high bandwidths. My guess is that
               | 25G DisplayPort switch would be just as expensive as 25G
               | SFP one.
        
               | aidenn0 wrote:
               | Yeah, my original comment was based on the "100x more
               | expensive" assertion. a 5m DAC cable is $40 on Amazon,
               | and 25GBe cards can be had for under $200. I think it's
               | probably more than $2.40 for just a 5m DP cable, so 100x
               | was a gross exaggeration.
        
               | wmf wrote:
               | It's unidirectional and short reach.
        
             | kevin_thibedeau wrote:
             | HDMI is severely length limited. Ethernet has no
             | millisecond level latency guarantees. Data isn't always
             | just data.
        
             | yellowapple wrote:
             | > If this were true, then people would be running
             | datacenter networking over HDMI.
             | 
             | Why would they? If they need more bandwidth than what
             | Ethernet provides, they'd probably just bite the bullet and
             | go with fiber.
             | 
             | ...which punts the previous question to why the industry
             | hasn't gone all-in with fiber optic display connectors,
             | like it has with TOSLINK for S/PDIF in the audio world.
        
               | zokier wrote:
               | Ethernet over fiber is still ethernet.
        
               | wmf wrote:
               | TOSLINK was a gimmick. In reality, optical transceivers
               | are expensive and are only used when copper can't do the
               | job.
        
               | xxpor wrote:
               | TOSLINK/S/PDIF is way out of date and no one uses it any
               | more. Ironically, because it doesn't have enough
               | bandwidth for things like Atmos.
        
         | londons_explore wrote:
         | It should be.
         | 
         | As soon as video data is in packets, it should be routable over
         | a network like any other data.
         | 
         | Give up the custom connectors, custom cables, etc.
         | 
         | Need a wireless display? No worries, we can route the data over
         | wifi too.
         | 
          | Need 5 displays? That's what an ethernet switch is for. Screen
         | mirroring? We have multicast.
         | 
         | Need a KVM? Well you can probably write a few scripts to change
          | which computer your screen gets its data feed from.
         | 
         | I believe this hasn't happened simply because audio/video
         | people like going to conferences to design their own standards
         | every year, keep their licensing royalties, keep their closed
         | club - a software-only solution over any IP network isn't going
         | to fly.
        
           | tverbeure wrote:
           | They write that DP uses packets, but it's still an
           | isochronous connection with guaranteed BW allocation.
           | 
           | For video timings with high pixel clocks, the bandwidth used
           | to transfer a signal is very close to what's theoretically
           | available. There's no way you'd be able to do that reliably
           | over something like an Ethernet cable.
           | 
           | There's nothing sinister about how video standards are
           | designed. From DP1.0 to DP1.4, which spans more than 10
            | years, all changes have been incremental, and most were just
            | a matter of increasing data rates while the coding method
            | stayed the same.
           | 
            | They need a system with very high, guaranteed bandwidth, high
            | utilization, a very low bit error rate, and low cost.
           | 
           | Even today, a 10Gbps Ethernet card will set you back $90. And
           | that will carry less than half the data that can be
           | transported by DP 1.4 which is old school by now.
        
           | kccqzy wrote:
           | Have you tried VNC over an Ethernet cable at all? What you
           | are proposing is basically having a VNC-like protocol over IP
           | over Ethernet. Now try it out and see how well it works.
           | 
           | To be even more reductive, a thought experiment is to
           | consider the classic USB. Why did people even invent USB in
           | the first place in the 1990s? The first USB had only 12Mbps,
           | not much better than the first Ethernet at 10Mbps which had
           | existed since 1980! Why didn't people who invented USB simply
           | use Ethernet?
        
           | NavinF wrote:
           | Naw it's because >=40G Ethernet never saw much consumer
           | adoption. QSFP DACs and optics are also kinda bulky by
           | consumer standards so a lot of people would want a new
           | connector. Even LC connectors would be too large for laptops.
        
         | [deleted]
        
         | FuriouslyAdrift wrote:
          | Dante AV does this. I'm sure it's expensive, but it exists for
          | commercial applications.
         | https://www.audinate.com/products/manufacturer-products/dant...
        
       | muhammadusman wrote:
       | This post was informative and I didn't realize just how different
       | DisplayPort is from HDMI. Recently, I got a desktop dock that
       | uses DisplayPort instead of HDMI to connect to my monitor. My
       | monitor has 2 HDMI ports, 1 Type-C, and 1 DisplayPort. So far
       | things have been fine but I did notice that the audio is choppy
       | no matter what I do. I thought it was the dock but audio going
       | from my computer > dock > my webcam's speaker works fine (all
       | over usb-c). So unfortunately, it leads me to believe that the
       | DisplayPort is causing this jittery audio.
        
         | smachiz wrote:
         | It might be your dock.
         | 
         | Check to see whether it's USB or Thunderbolt. Thunderbolt docks
         | are more expensive, but considerably more efficient and faster
         | than USB (assuming your laptop/device supports Thunderbolt).
         | 
         | Thunderbolt docks are basically PCIe extension devices, whereas
         | USB docks attach everything as USB, with all the common
         | challenges USB has on systems (like dropped audio when the CPU
         | is busy).
        
           | muhammadusman wrote:
           | Thanks for the info. The dock is the CalDigit TS3 Plus dock
           | and I'm using it with an LG monitor. Their page says it's a
           | Thunderbolt dock so I wonder if there's anything else about
           | this particular dock that could be causing this issue. Btw
           | when the monitor was connected over HDMI, it was fine playing
           | audio.
           | 
           | https://www.caldigit.com/ts3-plus/
        
             | MBCook wrote:
             | That is absolutely TB and was one of the recommended docks
             | in the Mac world for a long time.
             | 
             | Have you talked to support? Or I wonder if it's an issue on
             | the LG side.
        
             | SV_BubbleTime wrote:
             | No one would be able to answer this for you in your house
             | with your cables with your speakers. Get another one and
             | test it.
        
         | jbverschoor wrote:
         | Same here with an lg monitor on dp
        
       | remix2000 wrote:
       | With HDMI, I have both video _and_ audio traveling over a single
       | cable. That is _extremely_ convenient. DP _theoretically_
       | supports audio too, but AFAIK that's not widely implemented.
        
         | theandrewbailey wrote:
         | The audio receiver industry has settled on HDMI being the only
         | way to get uncompressed surround sound (5.1+) from an external
         | device, like a PC. I wish they would do that over USB, too. It
         | sucks having to configure an external display (even if
         | duplicated) just to get good sound.
        
         | [deleted]
        
         | lucideer wrote:
         | > _That is _extremely_ convenient._
         | 
         | I'm open to having my mind changed but I have not really
         | experienced this extreme convenience in practice. The _vast_
         | majority of things I plug an HDMI cable into either don't have
         | speakers or if they do, the speakers are a bad quality
         | afterthought.
         | 
         | In actual fact, what has been an extreme inconvenience is the
         | OS thinking I want HDMI audio instead of whatever much higher-
         | quality* alternative output I actually want to direct my audio
         | to.
         | 
         | The _only_ time I've ever gotten high-quality audio output
         | over HDMI is via ARC, which says a lot about the need for HDMI
         | audio...
         | 
         | * When I say "much higher-quality", I don't mean HDMI is a low-
         | quality audio transport, I mean the HDMI output devices more
         | often than not have inferior audio output to some other
         | bluetooth / 3.5mm jack device I am using.
        
           | J_Shelby_J wrote:
           | Audio over HDMI or DP has never been anything but a nuisance
           | on my windows workstations. Windows finds a way to set the
           | speaker monitor/mic as the system defaults at least once a
           | month for me. Every GPU driver update I end up disabling the
           | display's audio in device manager. Hrm... maybe I could write
           | a script to do it at boot.
           | 
           | I think it could make sense in budget friendly setups, but...
           | personally I'd pay extra to not have to deal with the
           | nuisance and security risk of microphone input I can't
           | physically disable.
        
           | mywittyname wrote:
           | I use my PC to drive a TV for gaming and all I have to do is
           | switch the sound output in Windows from Speakers to TV and
           | I'm good to go. No need for splitters and multiple cables to
           | get sound to both devices. The only thing I would change is
           | to have Windows support outputting audio to both devices
           | regardless, but keeping the settings menu open to the right
           | page isn't really an inconvenience.
           | 
           | All of my consoles work the same way: just run a single HDMI
           | cable for each one.
           | 
           | The counter point might be that my TV would have "bad audio."
           | But I don't think so (at least not at the volumes I prefer),
           | and even if it did, it supports audio out to connect the TV
           | audio to a separate hi-fi.
        
           | aidenn0 wrote:
           | I plug my computer into my AV receiver all the time, and my
           | speakers are worth more than my TV.
        
             | bleepblop wrote:
             | If your receiver doesn't support eARC, you are definitely
             | not getting full resolution audio.
        
               | aidenn0 wrote:
               | I don't use ARC at all (all audio sources go through the
               | receiver), so that's a non-issue for me.
        
           | ShadowBanThis01 wrote:
           | Do you not have a proper home A/V setup?
           | 
           | Almost everything I use for media sends audio and video to my
           | receiver over HDMI; the exception is analog sources, which go
           | into a mixer whose output is digitized and sent over CAT-6
           | cable to an optical converter sitting on the receiver.
        
           | hot_gril wrote:
           | TV connection is a pretty big use case where audio matters.
        
           | remix2000 wrote:
           | I have an audio extractor that conveniently passes the sound
           | from HDMI to my desktop speakers. The extractor is connected
           | between my HDMI input switcher and the [primary] monitor, so
           | whichever device I'm currently using outputs the audio to the
           | speakers.
        
           | taeric wrote:
           | It was very convenient for me to have my headphones plugged
           | into the monitor. Made it so I could plug a single cable over
           | to my laptop and have basically everything else already done.
        
             | IggleSniggle wrote:
             | Every monitor I've ever had has had terrible coil-whine
             | from the headphone jack. I always try them for exactly this
             | reason, and get super annoyed each time. It didn't have to
             | be this way!
        
               | deergomoo wrote:
               | I have this problem on my gaming monitor, thankfully it
               | also has USB-A ports so a cheap adapter has solved it.
               | 
               | I have no idea if there are downsides to audio over USB-A
               | but for my fairly basic use case ("being able to hear
               | things and not hear coil whine") it works pretty well.
        
             | lucideer wrote:
             | > _headphones plugged into the monitor_
             | 
              | That sounds cool but I've never seen a monitor with a jack.
             | Mine all have many USB ports, but not audio. How common are
             | they?
        
               | sublinear wrote:
               | I have a 3.5mm audio jack on a cheap 4k LG monitor I've
               | been using for years
        
               | dizhn wrote:
               | An audio jack is very common. Some even come with
               | speakers. They are usually not very good but they get the
               | job done in a pinch.
               | 
               | I use 2 computers connected to the same monitor via hdmi
               | and use barrier (and a script) to switch screens. The
               | audio for the correct monitor gets activated
               | automatically. It's very convenient.
        
               | dekhn wrote:
                | All the Dell midrange monitors I've bought (i.e., $400-$1K)
                | have an audio jack for headphones.
        
               | yonatan8070 wrote:
               | I have a 144Hz 1080p AOC monitor and a secondary generic
               | LG one, both have an audio jack
        
           | skipnup wrote:
            | One extremely convenient situation for me is getting surround
            | sound from the TV to the AV receiver in a single cable.
        
           | babypuncher wrote:
           | HDMI was designed to accommodate home theaters. That usually
           | means a sound system with good speakers. Not every user needs
           | this expanded feature set, but enough do that DisplayPort
           | does not make a good alternative.
           | 
           | Here's where ARC/eARC really benefit me, and I don't see how
           | this problem could be solved with DisplayPort.
           | 
           | I have a PC and a PS5 that support Variable Refresh Rate
           | (VRR) over HDMI. I bought a new TV last year that also
           | supports VRR, but I did not want to replace my perfectly
            | adequate 3-year-old receiver that does _not_ support VRR.
            | Even today, VRR support on receivers is questionable at best,
           | and most users likely want their VRR devices plugged directly
           | into their TV.
           | 
           | Thanks to eARC, I can plug my PS5 or PC directly into my TV
           | and have it send the PCM 7.1 surround sound audio to the
            | receiver over HDMI. I can still leave other devices like an
            | Apple TV or Blu-Ray player plugged into the receiver, and
           | everything just works.
           | 
           | Without eARC, I would have to fall back on TOSlink. That
            | means extra cables, and dropping down to compressed Dolby
            | Digital 5.1 if I want to keep surround sound. Using Dolby
            | Digital on a game console incurs a pretty noticeable latency
           | penalty, which is why they all default to PCM.
        
           | inconceivable wrote:
           | i have 3 computers plugged into my 4k tv (monitor) via hdmi.
           | tv to dac via optical. works fine for audio+video at 4k60hz.
        
           | sleepybrett wrote:
           | I have never once used hdmi audio for anything but an a/v
           | type experience. Monitor speakers are generally garbage,
           | monitors generally don't support bluetooth for audio
           | connectivity, their headphone jacks are often in hard to
         | reach places, and I have yet to find one that supports a
           | microphone. Just easier to go directly to the machine for
           | audio.
        
         | nfriedly wrote:
         | I have two different LG monitors that both support audio over
         | the DP cable.
        
         | Sweepi wrote:
         | I have been using DP+Audio for the last decade with a vast
         | array of different monitors.
        
         | leoc wrote:
         | Unfortunately HDMI audio is also notorious for often having
         | serious lag, both in absolute terms and relative to the video
         | signal.
        
           | bravo22 wrote:
           | The issue is not with HDMI. The audio data is sent between
           | each frame. The lag comes from the fact that TVs apply post-
           | processing to video which causes it to lag relative to audio.
        
           | ulrikrasmussen wrote:
           | Isn't the whole idea of routing the audio through the display
           | that it can compensate for any lag due to processing of the
           | video signal? I haven't really noticed any problems with
           | HDMI, and if I did, I think I would blame the display.
        
             | michaelt wrote:
             | No, the point of supporting audio and video on the same
             | cable is for people plugging their DVD player into their
             | TV.
        
           | kimburgess wrote:
           | That's not HDMI causing that pain. In fact, HDMI includes
           | specific functionality for lip sync correction and enabling
           | devices to expose both audio and video latencies they add to
           | the signal.
        
           | solarkraft wrote:
           | Are you sure that it's related to HDMI? _TVs_ often do
           | processing on the video signal which introduces lag, but my
            | Samsung TV in "PC Mode" is virtually lag-free (and I use
           | Logitech's receiver for my mouse because Bluetooth feels
           | slightly too laggy for me).
        
             | kukx wrote:
             | Seems like a bug on the tv part then. They should delay
              | the audio to match the lag.
        
           | metaphor wrote:
           | Unless the standard has been substantially relaxed since HDMI
           | v1.3a, it sounds like a downstream issue.
           | 
           | Citing SS 7.5 normative language:
           | 
           | > _An HDMI Source shall be capable of transmitting audio and
           | video data streams with no more than +-2 msec of audio delay
           | relative to the video._
        
           | AndrewOMartin wrote:
           | Sounds like you need to upgrade to gold plated HDMI.
        
             | selykg wrote:
             | Better get that Monster Cable!
        
         | tverbeure wrote:
          | All modern GPUs and monitors (that have a speaker) support
         | audio-over-DP. They have for over a decade.
         | 
         | It's widely implemented.
        
         | naikrovek wrote:
         | are you thinking of DVI or something?
         | 
         | I've been using audio over DisplayPort for years and years and
         | years without issue. every device I've ever had that supports
         | DisplayPort supports audio over that connection.
        
         | [deleted]
        
         | haunter wrote:
         | > DP _theoretically_ supports audio too, but AFAIK that's not
         | widely implemented
         | 
         | I'm using my monitor's speaker through DP since GTX 1070
         | (2016). Currently RTX 4070 Ti and that works too. And it's a
         | pretty mediocre AOC monitor I have.
        
         | scarface_74 wrote:
         | I have a portable external monitor that is powered and gets
         | video via a single USB C port which I assume is using DP. It
         | also does audio over the cable.
        
         | adrian_b wrote:
         | I have rather old NVIDIA GPUs and Dell monitors, and both the
         | GPUs and the monitors support audio over DisplayPort.
         | 
         | I doubt that there exists any reasonably recent GPU that does
         | not support audio over DisplayPort, so if there are problems
         | they might be caused by the monitors. I have no idea how many
         | monitors have speakers and DisplayPort without supporting audio
         | over DP.
        
         | gjsman-1000 wrote:
         | For some reason, outside of video, HDMI knows how to shoot
          | itself in the foot almost as well as the USB-IF.
         | 
         | HDMI CEC? Has anyone actually got that working correctly?
         | 
         | Anyone remember Ethernet over HDMI? Apparently that was a
         | thing. (Not to be confused with HDMI over Ethernet, which
         | actually has some uses.)
         | 
         | HDMI for soundbars? We got ARC ports (Audio Return Channel).
         | But it was buggy, didn't support lossless audio, needed new
            | DRM, and was bad at maintaining lip sync, so they introduced
            | eARC (_Enhanced_ Audio Return Channel). eARC works by...
            | scrapping Ethernet over HDMI and reusing those wires. Better
            | get a new TV that supports it, since there are no adapters
            | (downgrading to regular
         | ARC if any part of the chain doesn't support eARC).
        
           | aidenn0 wrote:
           | CEC "just works" for me and has for as long as I've owned a
           | flat panel TV (maybe 12 years or so?). Back in the day
           | (before I had an HDMI receiver), I could use the TV remote to
           | control the bluray player. These days I have a receiver, and
           | the volume control on pretty much any device will control the
           | audio on the receiver.
           | 
           | ARC was a bit more spotty, but I moved somewhere with no live
           | TV reception, so my only use-case for it went away
           | (everything else routes audio through the receiver).
        
           | ApolIllo wrote:
           | HDMI-CEC is great for TVs but when I searched recently I
           | found that both Nvidia and AMD do not support HDMI-CEC. This
           | was disappointing to find out when I built an HTPC for my TV.
        
           | scarface_74 wrote:
           | > HDMI CEC? Has anyone actually got that working correctly
           | 
           | I stay in lots of hotels and take my Roku stick. Most of the
           | time the Roku remote can control the TV volume and power
           | using CEC
        
       | hot_gril wrote:
       | "We've all gotten used to HDMI." Well that's it then, I don't
       | need another less common way to do the same thing. There's always
       | that theory that low-end devices will opt for DP to avoid
       | royalties, but the cheapest laptops and monitors tend to be HDMI-
       | only if anything because DP is more of a special feature.
       | 
       | Similar story with H.264/5 vs VP8/9.
        
         | gs17 wrote:
         | > There's always that theory that low-end devices will opt for
         | DP to avoid royalties, but the cheapest laptops and monitors
         | tend to be HDMI-only if anything because DP is more of a
         | special feature.
         | 
         | Yeah, when I needed a new laptop dock, one of my annoyances was
         | having to pay extra for a DP one because our lab's IT refuses
         | to stock adapters or cables "because people keep using them
         | instead of returning them" and they "spent extra" to get
         | monitors which have a single DP port.
        
       | bee_rider wrote:
       | I just like the little retaining clips, and the satisfying little
       | click you hear/feel when plugging them in.
        
         | green-salt wrote:
         | I have a love/hate relationship with them for when the release
         | tab is in a hard to reach area or doesn't release well, but
         | that's usually the cable's fault. Love that tactile sensation
         | of it being properly mated.
        
         | ComputerGuru wrote:
         | It's a problem when the x16 slot is the bottom-most and the
         | orientation is such that the clicky thing you must depress is
         | on the _bottom_ side of the DP connector when plugged in, and
         | your case has a ledge after the PCI slots so you cannot
         | realistically depress the latch.
         | 
         | Of course, no one would realistically encounter a
         | case/GPU/cable combo like that. Right?
        
           | jacquesm wrote:
           | Hm, my sample for that situation clocks in at about 100%
           | (n=1). Especially the first time I needed to unplug that
           | cable I was thrown for a bit, I ended up taking the whole
           | case off the floor and using a thin screwdriver to depress
           | the release button from the exposed side, that did the trick.
        
           | NavinF wrote:
           | Your GPU should be in the x16 slot closest to the CPU because
           | it has the most bandwidth.
           | 
           | tbh this sounds like a self-inflicted SFF build problem
        
       | chx wrote:
       | > However, if you don't need the full bandwidth that four lanes
       | can give you, a DP link can consist of only one or two main lanes
       | resulting in two
       | 
       | This is how many USB C docks operate. The USB C connector also
       | has four high speed lanes and there's a mode where two are
       | assigned to carry DisplayPort data and two are assigned to be TX
       | and RX respectively for USB. Until DP 1.4 appeared, this meant
       | you were quite limited in display resolution if you didn't have
       | Thunderbolt and wanted faster than 480 Mbps data speed. With DP
       | 1.3, two HBR3 lanes can carry 12.96 Gbit/s, which is almost
       | exactly the requirement for 4k @ 60Hz at 12.54 Gbit/s. DP 1.4 adds
       | compression on top of this. One more beauty of DisplayPort is
       | that it's point-to-point, so it's entirely possible your USB C hub will
       | carry the display data from the host to the hub as compressed
       | HBR3 data over two lanes and then hand it out to two monitors
       | over four uncompressed HBR2 lanes to each so a modern USB C hub
       | can, without Thunderbolt, drive two 4k @60Hz monitors and still
       | provide multiple gigabit speed data. It's a very neat trick. This
       | needs full DisplayPort 1.4 support, including DSC, in the host; for
       | integrated GPUs in laptop CPUs this means AMD Ryzen 4000 and
       | newer or Intel Tiger Lake and newer (older laptops with discrete
       | GPUs might have had it, too).
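       | For the curious, the lane arithmetic works out like this (a
       | minimal sketch, not any official formula; 8b/10b line coding
       | means each lane's 8.1 Gbit/s raw rate carries 80% payload):

```python
# DP alt mode on USB C: two of the four high speed lanes carry
# DisplayPort, the other two carry USB. HBR3 runs 8.1 Gbit/s per
# lane raw; 8b/10b coding leaves 80% of that as payload.

HBR3_LANE_GBPS = 8.1
CODING_EFFICIENCY = 0.8  # 8b/10b

def dp_payload_gbps(lanes):
    """Effective DisplayPort payload bandwidth for N HBR3 lanes."""
    return lanes * HBR3_LANE_GBPS * CODING_EFFICIENCY

FOUR_K_60_GBPS = 12.54  # 3840x2160 @ 60 Hz, CVT-R2 reduced blanking

two_lanes = dp_payload_gbps(2)
print(f"2 HBR3 lanes: {two_lanes:.2f} Gbit/s payload")
print(f"4k60 fits: {two_lanes >= FOUR_K_60_GBPS}")
```

       | Two lanes squeak past 4k60 with almost nothing to spare, which
       | is exactly why the two-lane-DP-plus-USB split was so limiting
       | before DSC arrived.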
       | 
       | Handy tip: if your hub is DP 1.4 and drives multiple monitors
       | then it's most likely using a Synaptics MST hub to do this
       | (almost all non-Thunderbolt ones do and many Thunderbolt ones as
       | well) and Synaptics provides a very very little known diagnostics
       | tool called vmmdptool (available in the Windows Store). It
       | doesn't replace a full DP AUX protocol analyzer of course but
       | it's free and for that price it's really handy.
       | 
       | This topic is dear to me because I have _fixed the USB C
       | specification_ related to this and allow me to be damn proud of
       | that: it used to erroneously say the USB data speed in this mixed
       | functionality was limited to 5gbps but it is not, the limit is
       | 10gbps. https://superuser.com/a/1536688/41259
       | 
       | Ps.: When I say Thunderbolt, I am well aware of how Thunderbolt 4
       | is just USB4 with optional features made mandatory. It's not
       | relevant to the discussion at hand.
       | 
       | Pps.: DisplayPort is the native video standard for USB C,
       | C-DisplayPort adapters and cables are basically passive because
       | they just need to tell the host how to configure the connector
       | lanes. Meanwhile all USB C - HDMI cables and converters are
       | active, continuously converting the DP signal into HDMI.
       | DisplayPort++ alas is not implemented with the USB C connector.
       | For this reason if any compatibility issues arise it's always
       | better to connect a USB C device to the DisplayPort input on a
       | monitor. An HDMI alternate mode was defined in the past but it
       | remained paper only and it has been declared dead this year at
       | CES.
        
         | someplaceguy wrote:
         | What is the benefit of adding compression here? Does that mean
         | that the bandwidth might or might not be sufficient depending
         | on which pictures are being shown on the screen?
         | 
         | If so, that doesn't seem very reliable and if not, what's the
         | point of compression?
        
           | chx wrote:
           | > DSC is most often used in a Constant Bit Rate (CBR) mode,
           | where the number of coded bits generated for a slice is
           | always a constant that is generally equal to the bit rate
           | multiplied by the number of pixels within the slice
           | 
           | https://vesa.org/wp-content/uploads/2020/08/VESA-DSC-
           | Source-...
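            | 
            | In other words (a toy illustration; the slice geometry and
            | bpp target below are made up, not from the spec):

```python
# Toy illustration of the quoted CBR property: the coded size of a DSC
# slice is (target bits-per-pixel) x (pixels in the slice), regardless
# of image content. Slice geometry and bpp target below are made up.

slice_width, slice_height = 1920, 108  # one hypothetical slice
target_bpp = 8                         # 24 bpp RGB compressed 3:1

pixels = slice_width * slice_height    # 207,360 pixels
coded_bits = target_bpp * pixels       # the same for every slice

print(coded_bits // 8, "bytes per slice")  # 207360 bytes, every slice
```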
        
             | someplaceguy wrote:
             | Ah, so basically this is typically used as lossy
             | compression with a constant bit rate, which makes the
             | bandwidth usage predictable/constant but depending on the
             | pictures being shown (and the noise level?), it can lead to
             | some visual degradation (which they say is usually
             | imperceptible).
             | 
             | That's interesting and surprising to me.
        
               | chx wrote:
                | If you consider that ~1:18 compression is considered visually
                | lossless for x264, then a mere 1:3 compression can very likely
                | be achieved in a visually lossless way indeed.
        
               | jamiek88 wrote:
               | Is it not compressing an already compressed image though?
        
         | emilyst wrote:
         | > Ps.: When I say Thunderbolt, I am well aware of how
         | Thunderbolt 4 is just USB4 with optional features made
          | mandatory. It's not relevant to the discussion at hand.
         | 
         | Dear God, I hope this situation settles down in the near
         | future. As it is I have years of USB-C-looking cables that all
         | do different things but are visually indistinguishable.
        
           | privacyking wrote:
            | If you get a Thunderbolt 4 cable it can do everything, at
            | least until the 80Gbps one comes out.
        
           | chx wrote:
           | I wish the manufacturers would just adopt the USB IF
           | marketing names and logos. https://i.imgur.com/H3unbD5.png
           | would be a lot simpler.
           | 
           | I also wish the USB IF defined colors for high speed lanes
           | absent vs 5/10/20 gbps capable high speed lanes and then
            | 60W/100W/240W power. All it would take is two color bands on
           | the plastic part of the plug. If colors are too gaudy then go
           | Braille-style, have a 2x2 grid on top and bottom of the plug
           | where bit 0 is a little hole and bit 1 is a little bump.
           | That's 16 possible data speeds and 16 possible power levels
           | and so far we have only needed 4 for data and 3 for power.
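            | 
            | As a toy sketch of the idea (the level tables below are made
            | up for illustration, not from any USB IF document):

```python
# Toy sketch of the Braille-style marking: a 2x2 grid of bumps/holes on
# the plug encodes 4 bits, i.e. 16 levels each for data speed and power.
# The level tables below are illustrative, not real USB IF assignments.

DATA_LEVELS = ["no high-speed lanes", "5 Gbps", "10 Gbps", "20 Gbps"]
POWER_LEVELS = ["60 W", "100 W", "240 W"]

def grid(code: int) -> str:
    """Render a 4-bit code as a 2x2 grid ('o' = hole/0, '*' = bump/1)."""
    bits = [(code >> i) & 1 for i in range(4)]
    cell = lambda b: "*" if b else "o"
    return f"{cell(bits[0])}{cell(bits[1])}\n{cell(bits[2])}{cell(bits[3])}"

print(grid(DATA_LEVELS.index("10 Gbps")))   # code 2 -> 'o*' / 'oo'
print(grid(POWER_LEVELS.index("100 W")))    # code 1 -> '*o' / 'oo'
```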
           | 
           | Intel could've added a separate row for Thunderbolt.
        
       | bbx wrote:
       | The reason why I switched to using the DisplayPort is that it's
       | the one port that supports higher refresh rates on my Dell
       | monitors. I didn't realise it was more "friendly" than HDMI
       | (which comes in various quality standards).
        
       | nfriedly wrote:
       | I'm annoyed at Nvidia for putting the current generation of HDMI on
       | their recent GPUs, but leaving them with an outdated version of
       | DisplayPort.
       | 
       | For a long time, my advice to anyone was to always choose
       | DisplayPort whenever it was an option. But now that has to have
       | the caveat of "if you have a new high-end GPU and a fancy high
       | refresh rate monitor, HDMI might actually be better for your
       | situation"
        
         | 111111IIIIIII wrote:
         | And only 1 new version HDMI port but 3 old version DP ports. I
          | use a dual display setup but HDR only works on my displays with
         | HDMI.
        
         | ThatPlayer wrote:
         | Yeah I hate this too because I just want to have more video
          | outputs. My Valve Index doesn't like being hot-plugged,
          | requiring a reboot. With 1440p144Hz monitors, I just barely
         | cannot run 2 of them (2x14Gbit) over a single DP1.4 (26Gbit)
         | using MST. Windows will automatically drop colour down to 4:2:2
         | if I try it.
         | 
         | Not that DP2.0 MST hubs exist yet afaik, but when they do I'd
         | have to get a new GPU. Which I guess is Nvidia's goal.
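          | 
          | The back-of-the-envelope math for those numbers (approximate,
          | my own figures):

```python
# Rough check of the numbers above: two 1440p@144 streams vs. one
# DP 1.4 (HBR3) link. Blanking overhead is an approximation.

HBR3_LANE_GBPS = 8.1
link_payload = HBR3_LANE_GBPS * 4 * 0.8       # 8b/10b -> ~25.9 Gbps

# 2560x1440 @ 144 Hz, 8 bpc RGB, ~6% blanking overhead (CVT-R2-ish)
stream = 2560 * 1440 * 144 * 24 * 1.06 / 1e9  # ~13.5 Gbps

print(f"{2 * stream:.1f} Gbps needed vs {link_payload:.1f} Gbps available")
```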
        
         | gjsman-1000 wrote:
         | > I'm annoyed nvidia for putting the current generation of HDMI
         | on their recent GPUs, but leaving them with an outdated version
         | of DisplayPort.
         | 
         | That was due to unfortunate timing where HDMI had the
         | specifications ready before DisplayPort did.
        
           | Night_Thastus wrote:
           | I thought it was because their G-Sync modules had not been
           | updated to support the new DP? That was what I heard.
        
           | Sweepi wrote:
            | AMD's RX 7000 cards support DisplayPort 2.0 and were released 2
            | months later than Nvidia's Ada. Afaik DP 2.0 was
            | finished in 2019(!).
        
         | bdavbdav wrote:
         | I also noticed that many boards are 3xHDMI+1xDP now. My
          | previous card was 3xDP+1xHDMI.
        
       | LoveMortuus wrote:
       | I personally prefer HDMI or DVI, I don't like DisplayPort because
       | on the display that I have, when I turn off the display the
       | computer detects it as if I've physically disconnected the
       | display, which then moves all the windows that I had open around
       | and breaks my macros...
        
         | qubitcoder wrote:
         | While not specific to DisplayPort, I've been really impressed
         | with the AV Access 8K KVM Switch [1]. 4K at 120Hz without
         | chroma subsampling was a hard requirement.
         | 
         | I use it with macOS Ventura and a Windows 11 desktop. It works
         | nicely in conjunction with a Dell Thunderbolt 3 dock to power
         | an LG OLED and support additional accessories. And it has EDID
         | emulation, which is crucial for maintaining consistent display
         | arrangement.
         | 
         | I've tried other DisplayPort KVM switches with mixed success.
         | This is the first one that's worked out of the box.
         | 
         | [1] https://www.avaccess.com/products/8ksw21-kvm/
         | 
         | [2] https://www.amazon.com/dp/B0BPRZPFM6?
        
         | zerkten wrote:
         | I /think/ this is related to the EDID support or configuration.
         | 
         | What I do know is that you need specific EDID support when
         | using a DisplayPort KVM because that switches the computer away
         | on a regular basis. If you have multiple screens and a single
         | screen KVM (fairly common) it will do the re-arrangement that
          | you've experienced. With EDID support it keeps both outputs
          | alive, resulting in no re-arrangement.
         | 
         | EDIT: It seems the correct term may be "EDID emulation".
        
           | tivert wrote:
           | > If you have multiple screens and a single screen KVM
           | (fairly common) it will do the re-arrangement that you've
            | experienced. With EDID support it keeps both outputs alive,
            | resulting in no re-arrangement.
           | 
           | Can you tell me some KVM switches that have a working
           | implementation of that feature? I'm looking for one.
           | 
           | I'm still using a DVI KVM, because when I tried to upgrade to
           | Displayport (years ago), I ran into the problem you describe
           | and gave up.
        
             | zerkten wrote:
              | I have a TESmart (TES-HDK0402A1U-USBK) from
              | https://www.amazon.com/gp/product/B08259QL5J/ref=ppx_yo_dt_b...
              | which works fine in this regard. This was probably the 4th
             | DisplayPort KVM I tried due to keyboard issues with the
             | others.
        
               | dogline wrote:
               | I'm looking too, but that one linked has HDMI out only,
               | with DP and HDMI in. Don't you lose the benefit of DP
               | with this setup?
        
         | vel0city wrote:
         | That's a failure of the display or your GPU, not necessarily
         | the port. I've got several monitors with DisplayPort on a few
          | different GPUs that don't exhibit this behavior. It's not something
         | inherent with DisplayPort.
        
         | Aaargh20318 wrote:
         | Sounds like an OS problem. When I do that it moves all open
         | windows to the remaining display, but when I turn the other
         | display back on it just moves them back. No special software
         | installed for this, just stock macOS Ventura.
        
           | Perz1val wrote:
           | It's not an OS problem, that's how DP is supposed to work and
           | that's in fact stupid
        
             | Aaargh20318 wrote:
             | It's an OS problem for not handling monitors being
             | disconnected and reconnected properly.
             | 
             | What _is_ stupid is having half your windows unreachable
             | because they are on a monitor that is turned off. How does
             | that help anyone?
             | 
             | Imagine how annoying it would be on a laptop? I use my
             | laptop in a meeting, and have a few windows open on its
             | screen, then I arrive at my desk and plug in my 2 external
             | monitors and keep my laptop lid closed. Imagine if the
             | windows on the laptop display stayed there, they would be
              | unreachable and I would have to manually move them to my
             | other screens every time I moved from a meeting to my desk.
             | Same for the other way around, I disconnect from my
             | monitors and take my laptop to a meeting, and then I can't
             | access any of the dozens of windows I had open on my 2
             | external monitors.
             | 
             | That would be an absolutely brain dead way of working.
        
               | theamk wrote:
               | You are talking about physical monitor unplugging, while
               | others are talking about monitor power off, but still
               | plugged in.
               | 
               | I agree that when monitor is physically removed, the
               | windows should go back.
               | 
               | But if it is just powered off but still plugged in? I say
               | windows should stay there, just like HDMI did. For
               | example I sometimes turn off monitors for less
               | distractions during regular conversations.. or to save
               | power before leaving.. in this case windows jumping all
               | over the screen are the last thing I want.
        
               | drbawb wrote:
               | >Imagine how annoying it would be on a laptop?
               | 
               | >That would be an absolutely brain dead way of working.
               | 
               | For starters: the experience I want on my laptop is not
               | the same as the experience I want on my desktop. I have
               | two monitors at home, sometimes I like to turn off the
               | wing-monitor when I am watching a movie on the center
               | screen. (To avoid distraction, and also minimize glare,
               | etc.) That doesn't mean I want those windows rearranged:
               | that's probably going to interfere w/ the movie that is
               | playing, fuck up my desktop wallpapers, icons, window
               | sizes, etc. The whole point of turning off the monitor is
               | to _hide those distractions._
               | 
               | Also not all window managers suck as badly as Mac OS and
               | Windows. By default I have odd-numbered virtual desktops
                | go to the left monitor, and even-numbered virtual
               | desktops go to the right monitor. If I want to move a
               | viewport to another monitor, I renumber it accordingly,
               | and all the windows move to that monitor: complete w/
               | their proportional positions and sizes.
               | 
               | The idea that _a device hotplug event_ would change how
               | my virtual desktops are laid out is so absurd to me that
               | I switched operating systems to avoid the default Windows
               | behavior. So maybe consider that paradigms and workflows
               | other than "a docked laptop" exist before calling people
               | braindead?
        
               | Aaargh20318 wrote:
               | > that's probably going to interfere w/ the movie that is
               | playing, fuck up my desktop wallpapers, icons, window
               | sizes, etc
               | 
               | That sounds like a Windows problem.
        
               | hot_gril wrote:
               | Makes sense to move the windows if you close a laptop
               | lid, but not if you turn off a monitor. You can't
               | disconnect the laptop screen, but you can easily
               | disconnect the monitor.
               | 
                | I have a cheap laptop that doesn't treat a turned-off
                | monitor as disconnected, and it's actually way nicer that
                | way.
        
               | Aaargh20318 wrote:
               | Why is it nicer? What is the use case for having
               | unreachable windows?
        
               | Joker_vD wrote:
               | So that I don't see them while they still stay where they
                | are! Preparing for a slide-show is one
               | example: move PowerPoint to the second screen where it'd
               | stay even if there is actually no second screen.
        
               | Aaargh20318 wrote:
               | But my windows aren't getting messed up, once I turn the
               | monitor back on it restores the previous state.
               | 
               | And the use case is simple: I have 2 monitors hooked up
               | to my machine but I don't always need 2 monitors. I have
               | a 34" 5k2k ultrawide and a 27" 4k in portrait mode. When
               | I'm coding or using my computer for an extended amount of
               | time I turn both on, but when I just want to quickly
               | write an e-mail I only turn on the main monitor. I mostly
               | use the ultrawide and have the portrait monitor to the
               | side to dump documentation and other materials I need for
                | quick reference on. Right now it's 29°C in my room and I
               | don't want to turn on more equipment than needed.
        
               | hot_gril wrote:
               | This seems like a very specific situation, and you could
               | disconnect the second monitor for it.
               | 
               | Not sure about the reliability of window restoration.
               | It's at least better in the latest macOS than in older
               | versions.
        
               | Aaargh20318 wrote:
               | Dive under my desk and unplug the monitor from the
               | Thunderbolt 4 hub or just press the power button.
        
               | hot_gril wrote:
               | Not having your windows get messed up when you turn the
               | monitor on/off, which is especially relevant for TVs and
               | projectors. On the flip side, what's the use case for
               | connecting to a powered-off monitor and not using it? If
               | I don't want to use a monitor, I won't connect it.
        
             | hot_gril wrote:
             | Is this specific to DP? I thought HDMI was the same.
        
           | LoveMortuus wrote:
           | But when I turn off my displays that are on HDMI or DVI or
           | VGA, the computer still detects them like normal, like
           | they're connected, because they are.
           | 
           | I don't understand in which case the 'act like it's
           | physically disconnected' behaviour would be more desired than
           | what we had with all the standards before.
           | 
           | I have read that some DisplayPort displays do have an option
           | in the settings to disable this behaviour.
        
             | Aaargh20318 wrote:
             | > But when I turn off my displays that are on HDMI or DVI
             | or VGA, the computer still detects them like normal, like
             | they're connected, because they are.
             | 
             | But since they're off, that doesn't make any sense, now you
             | can't reach any windows on those monitors. I think the
             | macOS behavior make the most sense, move them so you can
             | access them, but move them back once the display is turned
             | on.
        
               | Joker_vD wrote:
               | Or, you know, don't do any of those silly dances and just
               | let the windows stay where they are the whole time
               | between turning the displays off and on?
               | 
               | Reminds me of that problem with video driver update on
               | Windows when the screen is momentarily resized down to
               | 1024x768 resolution and then instantly goes back to
               | 2560x1440: all the non-maxed windows get shrunk down and
               | shifted to the upper-left corner (so they would be
               | visible on a 1024x768 screen) and then they just stay
               | like this. It's totally useless and actually quite
               | annoying.
        
               | someplaceguy wrote:
               | Sounds like a Windows problem once again...
        
               | kiririn wrote:
               | This nonsense is solved in Windows 11
        
               | throw0101c wrote:
               | > _But since they're off, that doesn't make any sense,
               | now you can't reach any windows on those monitors._
               | 
               | Perhaps I want to turn off the displays to darken the
               | room.
               | 
               | I don't want anything done with the programs/windows
               | except not look at them.
        
       | dsab wrote:
       | I hate HDMI, I had many problems with this interface. In my
       | previous job my monitor was turning off when I was getting up from
       | my chair; the same was happening at home with a different PC, cables
       | and monitor. Such a thing never happened to me once I changed
       | the interface to DisplayPort.
        
       | BadBadJellyBean wrote:
       | I just wish there was only one connector. Either HDMI or DP. I
       | don't care.
        
         | bluedino wrote:
         | Oh, but there's a third. MiniDP. I have a pile of useless
         | cables/adapters now, since Apple started using them but
         | replaced them with USB-C back in 2015.
        
           | cassianoleal wrote:
           | There is also mini and micro HDMI.
           | 
           | https://www.howtogeek.com/745530/hdmi-vs-mini-hdmi-vs-
           | micro-...
        
           | CyberDildonics wrote:
           | Oh, but that's not a different video interface.
        
         | mikepurvis wrote:
         | I've always been surprised there aren't more receptacles out
         | there that accept either a DisplayPort or HDMI plug. I've only
         | seen it on one motherboard, but in retrospect, it seems obvious
         | that the DisplayPort plug was designed to facilitate such a
         | thing.
        
           | yonatan8070 wrote:
           | Do you happen to know what the model of the motherboard was?
           | I've never heard of a receptacle that accepts both HDMI and
           | DP
        
             | mikepurvis wrote:
             | Unfortunately I don't, and it's a tricky thing to Google
             | for, but I believe it was something industrial where space
             | was at a premium (I work in robotics).
        
           | Kirby64 wrote:
           | Are you sure you're not thinking of DP++ ?
           | https://www.kensington.com/news/docking-connectivity-
           | blog/di...
           | 
            | It's a DisplayPort connector, but the port itself can become
           | an HDMI or DVI port purely with a passive adapter.
        
         | jayd16 wrote:
         | It'll be the USB-C port.
        
           | masklinn wrote:
           | For laptops it's pretty clearly USB-C with DP alt mode, just
           | way too convenient to use a display as a charging hub.
           | 
              | Desktop GPU manufacturers don't care and it doesn't really
           | drive sales (NVidia has completely removed it, and AMD's
           | tends to be pretty buggy, it's really just DP over a USB-C
           | port) so you need conversion add-in cards (usually from your
           | mobo manufacturer).
           | 
           | Also display manufacturers, standard connectivity seems to be
           | 2xHDMI 1xDP and a USB-C if you're lucky (with a few cool
           | oddballs providing 2xDP 1xHDMI instead). Pretty hard to do
           | all-USB if that means you can't plug more than one machine to
           | your display.
        
             | immibis wrote:
             | Many connectors in one. Is the video output the left port
             | or the right one? How about the charging port?
        
               | masklinn wrote:
               | Just make all ports fully capable?
        
           | Salgat wrote:
           | DisplayPort and HDMI's max length is 15m, compared to USB
           | 3.1's 3m.
        
             | jsheard wrote:
             | That very much depends on how much bandwidth you're pushing
             | through the cable, older versions of HDMI could tolerate
             | long cable runs but HDMI 2.1 is also limited to about 3m
             | unless you use an expensive active optical cable.
             | DisplayPort 2.0/2.1 is taking forever to come to market but
             | it's going to be similarly limited in its high bandwidth
             | modes when it does arrive.
        
               | latchkey wrote:
               | They aren't that expensive... 100' is $72 on Amazon.
               | 
               | https://www.amazon.com/Highwings-48Gbps-Dynamic-
               | Compatible-D...
        
               | puzzlingcaptcha wrote:
               | 40G FRL5 (4K @ 144Hz) seems to work just fine for me with
               | a 8m HDMI 2.1-certified cable (copper). But I had trouble
               | finding longer ones from the same vendor which makes me
               | suspect it is getting close to the limit.
        
           | BadBadJellyBean wrote:
           | That's okay as well as long as it's consistent
        
           | FinnKuhn wrote:
           | and USB-C allows the transmission of displayport over USB-C
        
             | jsheard wrote:
             | It also allows the transmission of HDMI over USB-C, so you
             | still have two competing standards
        
               | Zekio wrote:
                | No vendors implemented it, so whenever you are currently
                | doing HDMI over USB-C it is actually HDMI over DP over
                | USB-C.
        
               | johnwalkr wrote:
               | Which is fine and should stay well-supported because HDMI
               | is also here to stay. The trend for TVs is to include
               | HDMI ports but no displayports.
        
               | babypuncher wrote:
               | The problem is these two competing standards solve
               | overlapping but distinct problem sets. DisplayPort was
               | designed to be used for computer monitors, while HDMI was
               | designed for home theaters.
               | 
               | This may not seem like a very meaningful distinction to
               | most people, but it would become readily apparent to
               | anyone trying to design a home theater around DisplayPort
               | instead of HDMI. Off the top of my head, DisplayPort
               | lacks any equivalents for ARC, Auto Lipsync, and CEC.
               | Odds are your home theater makes use of at least two of
               | these features, even if you don't realize it.
        
               | hadrien01 wrote:
               | There's also Thunderbolt over USB-C that does
               | DisplayPort, but it doesn't support all the same versions
               | as DisplayPort over USB-C. And there's also USB 4 but I
               | don't know/understand if it changes anything.
        
               | masklinn wrote:
               | I think the most applicable bit is that USB4 adds
               | tunneling modes, aside from alternate mode.
               | 
               | I'm not entirely clear, but from my understanding
               | alternate mode means the physical connection gets
               | switched over, while tunneling means the tunneled data is
               | sent over USB (so the communication on the wire is USB
               | all along, you get nesting).
        
               | jsmith45 wrote:
               | This is essentially correct, although the encapsulation
               | format is really the Thunderbolt encapsulation format,
               | which is only USB in that it is now officially defined in
               | the USB4 specification.
               | 
               | USB4 hubs/docks for example, need to have the ability to
               | translate from encapsulated DisplayPort to alternate mode
               | for its downstream ports. USB4 hosts need to support both
               | encapsulated and alternate modes in the downstream ports.
               | The idea is that if a display device does not want to
               | implement any other non USB 2.0 peripherals (2.0 has
               | dedicated lines so it can support those), it can
               | implement only alternate mode, (and not need to support
               | the complexities of encapsulating USB4), plus all USB4
               | hosts needing to support DisplayPort means you know you
               | can connect such a screen to any USB4 device and have it
               | work (although supporting multiple screens like this is
               | optional).
               | 
                | One thing to note though is that DisplayPort alternate
               | mode has the option of reversing the device->host wires
               | of USB-C lanes and thus get 4 lanes of host->device data,
               | for 80Gbps at Gen 3 speeds if using both lanes.
               | 
               | USB4 V1.0 does not support lane reversal, and USB4 V2.0
               | can reverse only one bidirectional lane, since it still
               | needs to support device->host data. I think this lane
                | reversal is only possible when using the new Gen 4
               | speeds, which provided 80Gbps symmetric, or 120/40 Gbps
               | asymmetric.
        
               | vel0city wrote:
               | It allows it, but can you actually buy any products for
               | it?
               | 
               | https://www.notebookcheck.net/The-demise-of-HDMI-over-
               | USB-C-...
               | 
               | > True USB-C to HDMI adapters are no longer going to be a
               | thing. The HDMI Alt Mode is more or less history, and
               | DisplayPort has won. Notebookcheck spoke to HDMI LA and
               | the USB-IF about it.
               | 
               | > HDMI LA said that it doesn't know of a single adapter
               | that has ever been produced. Similarly, at the USB
               | Implementers Forum (USB-IF), people who are familiar with
               | the certification process have yet to see a true USB-C to
               | HDMI adapter.
        
               | bandrami wrote:
               | There's an oscilloscope screen that does the conversion
               | in software and was the bane of my existence last year.
        
               | ianburrell wrote:
               | There is also MHL alt mode. The HDMI alt mode was
               | horrible because it used the whole USB-C cable as HDMI
                | cable, preventing any other uses. MHL is HDMI but over a
               | single pair.
               | 
               | My understanding is that all the USB-C to HDMI adapters
               | are using DisplayPort because that is more widely
               | supported by devices. And the conversion chips are just
               | as cheap as MHL to HDMI.
        
               | [deleted]
        
               | jdiff wrote:
               | The article mentions the death of HDMI altmode. Is it
               | actually widely supported?
        
               | jsheard wrote:
               | It seems not, I missed that HDMI altmode is still stuck
               | at HDMI 1.4 and they're not making any attempt to bring
               | it up to HDMI 2.0. DisplayPort indeed won in that case.
        
           | EVa5I7bHFq9mnYK wrote:
            | I recently switched the DisplayPort cable connecting my laptop
            | to the monitor for USB-C and didn't see any difference, except
            | now I get video (4K@60Hz, max for my monitor) and audio, can
            | connect a mouse and keyboard directly to the monitor, and the
            | cable is thinner and lighter. Am I missing something?
        
             | deergomoo wrote:
             | Don't forget power delivery! Having a truly one-cable
             | docking setup is a godsend if you need to switch between
             | devices regularly.
        
           | jsight wrote:
           | I just wish the Google Pixel line supported display out over
           | USB-C.
        
       | gaudat wrote:
       | I really, really wish that my LG had a DisplayPort input. They do
       | offer a monitor that is the same size and, I believe, the same
       | panel, but the price increase does not really justify it. So sad
       | that I have to bear with the hell of DP/USB-C to HDMI
       | converters...
        
       | binkHN wrote:
       | I didn't realize Multi-Stream Transport (MST) requires OS
       | support, and I was surprised to find out macOS, with its great
       | Thunderbolt support, does not support this. "Even" ChromeOS can
       | do MST.
        
         | stephenr wrote:
         | Technically macOS _does_ support MST. But it _only_ supports it
          | to stitch together tiles of a single display. It does _not_ support
         | daisy chaining two displays.
         | 
         | Thankfully, every Mac for the last 7 years has Thunderbolt3 at
         | least, so getting dual-4K-display from a single port/cable is
         | still very doable, you just need a TB3 to dual DisplayPort or
         | HDMI adapter.
        
           | deergomoo wrote:
           | You can daisy chain the Studio Display and Pro Display, and
           | you could daisy chain the old Thunderbolt Display. Is that
           | using some custom thing over Thunderbolt rather than MST?
        
           | bpye wrote:
           | Well except for some of the Apple Silicon machines. The M1
           | (and maybe M2?) only have two video output blocks, of which
           | one is already used for the internal display. It's honestly
           | the biggest complaint I have about my M1 MBP. Yes DisplayLink
           | or whatever it's called exists but the performance is bad.
        
             | stephenr wrote:
              | Ugh, right. Probably a Freudian slip - I just mentally
              | pretend those configs don't exist, coming as they did
              | after half a decade of ubiquitous multi-display support
              | on Macs.
        
             | stephenr wrote:
             | But - and I can't believe I forgot this in my other reply -
             | this is one thing that really grinds my gears about Apple's
             | releases since 2018, on everything except the Mac Pro (both
             | 2019 and 2023).
             | 
             | They hard-code one DisplayPort stream to *something* other
             | than Thunderbolt.
             | 
             | On laptops, they hard-code one via eDP to the display,
             | which is useless if it's in clamshell mode.
             | 
              | On the Mac Mini, Mac Studio, and the MacBook Pro with an
              | HDMI port, one stream from the GPU is hard-coded to the
              | HDMI port. If you want the maximum number of displays,
              | one always has to be on HDMI.
             | 
              | But neither the 2019 nor the 2023 Mac Pro has this
              | limitation. Even on the 2019 model, where the HDMI ports
              | were physically on the card, they could route all video
              | streams via the system's TB3 ports.
             | 
              | I just checked, and the base M2 Mini and the M2 Pro MBP
              | seem to finally allow two video streams over the TB4
              | ports - but the M2 Ultra Studio, with the exact same SoC
              | as the M2 Ultra Mac Pro, still has this stupid artificial
              | limit.
        
             | jon-wood wrote:
             | I assume you got one of the original M1 MBPs. The more
             | recent models have more display blocks. M1 Pro can drive
             | two external displays, and I think the M1 Max can do three
             | or four. I'm still slightly pissed Apple ever shipped the
             | original M1 MBPs, they were horrible machines.
        
               | spockz wrote:
                | Although I wish I'd gotten the 15" and a bigger hard
                | drive, I'm still very happy with my first-gen 13" MBP.
                | Also, the fact that I can just plug in USB-C/Thunderbolt
                | monitors and the screen instantly displays and windows
                | reconfigure without flickering is amazing.
        
               | bpye wrote:
               | It is really my only complaint about it, and since I
               | needed a new laptop at the time it still made sense. I
               | would definitely rather one of the 14" ones now - but not
               | enough to buy a new device.
        
         | bdavbdav wrote:
         | Yep. Super annoying. I have a Dell WD22TB4 which works great to
         | drive 3 monitors for everything, except my Mac.
        
       | rbanffy wrote:
       | I was _very_ disappointed that HDMI transfers a constant bit-rate
       | stream of pixels - VGA, but digital. I expected that, given that
       | most display devices can hold at least a full frame in memory,
       | the stream could be limited to the parts of the image that
       | change, allowing higher frame rates when less than the full
       | frame changes.
       | 
       | That, and generating a signal could be much simpler than bit-
       | banging on a DAC.
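For a sense of scale, the fixed-rate stream described above is easy to cost out. A back-of-the-envelope sketch in Python, using the standard CEA-861 timing for 1080p60 (the 2200 x 1125 total raster comes from that spec; the rest follows from it):

```python
# Bandwidth of a fixed pixel stream (DVI/HDMI style): the link clocks
# out the full raster - active pixels plus blanking - every frame,
# whether or not anything on screen changed.
active_w, active_h = 1920, 1080    # visible pixels
total_w, total_h = 2200, 1125      # raster including blanking (CEA-861)
refresh_hz = 60
bits_per_pixel = 24                # 8 bits per RGB channel

pixel_clock_hz = total_w * total_h * refresh_hz
link_bits_per_s = pixel_clock_hz * bits_per_pixel
useful_fraction = (active_w * active_h) / (total_w * total_h)

print(f"pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")            # 148.5 MHz
print(f"raw video bandwidth: {link_bits_per_s / 1e9:.2f} Gbit/s")
print(f"clocks carrying visible pixels: {useful_fraction:.0%}")  # 84%
```

Around 16% of the link's clock cycles carry nothing but blanking, and the visible 84% is retransmitted in full each frame even when no pixel changed.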
        
       | hooverd wrote:
       | For a brief moment, being able to daisy-chain displays was so
       | cool. Now it feels like we've regressed to wrangling HDMI cables
       | out of a USB-C hub.
        
       | qwertox wrote:
       | This is a really well-written article. If you haven't bothered
       | reading about DisplayPort because you knew VGA, somewhat knew
       | DVI, and thought that HDMI was the culmination of it all, with
       | DisplayPort just some further kind of evolution, this article
       | does a really good job of explaining how DP is very different
       | and something new - something worth knowing about.
       | 
       | The sentence that actually got me to read the article was
       | "DisplayPort sends its data in packets." The article does a
       | good job of explaining what this means and how it differs from
       | HDMI.
        
         | ben0x539 wrote:
         | As a habitual comments skimmer, thanks for selling me on the
         | article. :)
        
       | Ajedi32 wrote:
       | > A carefully shaped money-backed lever over the market is
       | absolutely part of reason you never see DisplayPort inputs on
       | consumer TVs, where HDMI group reigns supreme, even if the TV
       | might use DisplayPort internally for the display panel
       | connection.
       | 
       | Curious about this. Is the HDMI standards group engaging in anti-
       | competitive behavior to prevent DisplayPort from taking over on
       | TVs? I've always assumed it was just momentum.
        
         | msie wrote:
         | I was just thinking about USB-C vs Lightning... What if the EU
         | were to mandate HDMI over DisplayPort?
        
           | Dr4kn wrote:
           | The EU generally chooses the more open and cheaper
           | alternative. They did this with phone charging and car
           | charging, and would probably do the same with display
           | connectors.
           | 
           | USB-C is the cheaper and more capable connector.
        
         | sedatk wrote:
         | > Is the HDMI standards group engaging in anti-competitive
         | behavior
         | 
         | They don't need to, the whole industry is behind it:
         | 
         | > The HDMI founders were Hitachi, Panasonic, Philips, Silicon
         | Image, Sony, Thomson, and Toshiba.
         | 
         | > HDMI has the support of motion picture producers Fox,
         | Universal, Warner Bros. and Disney, along with system operators
         | DirecTV, EchoStar (Dish Network) and CableLabs.
        
         | 6D794163636F756 wrote:
         | [dead]
        
       | sylware wrote:
       | Don't forget: DisplayPort is worldwide royalty-free, HDMI is
       | not (a similar issue to MPEG vs. AV1, ARM/x86 vs. RISC-V, etc.).
        
       | amelius wrote:
       | Can we please get monitors that can find their signal within
       | 100ms?
       | 
       | Can we please get monitors that communicate their orientation
       | (portrait/landscape etc.) to the computer?
        
         | jwells89 wrote:
         | The Apple Studio Display autodetects rotation and lets the
         | connected computer know so it can adjust the picture
         | accordingly, but it connects over Thunderbolt. It should be
         | possible for monitors to do this over USB-C too, but I'm not
         | sure about plain DisplayPort or HDMI.
        
           | theodric wrote:
           | _sigh_
           | 
           | Ok FINE I will move the goalpost
           | 
           | I want all that in a monitor that doesn't cost more than a
           | typical modal monthly income in Europe
        
           | deergomoo wrote:
           | > it connects over Thunderbolt
           | 
           | Not exclusively--it supports regular DP Alt Mode, though no
           | idea if those advanced features also work.
        
         | deergomoo wrote:
         | > Can we please get monitors that can find their signal within
         | 100ms?
         | 
         | TVs too--with various HDR standards, VRR etc becoming more
         | popular these days I quite often find myself staring at a blank
         | screen for 5-10s
        
           | amelius wrote:
           | With Smart TVs being the only option these days, I'm thinking
           | about building my own TV from a computer monitor.
        
       | wcfields wrote:
       | If we lived in a just, and virtuous world we'd all be using cheap
       | coaxial cables with BNC connectors via whatever version of SDI is
       | the highest bit rate.
        
         | mschuster91 wrote:
         | SDI can't do _any_ communication between the source and sink.
         | You can do a limited form if you use two SDI cables (one
         | upstream, one downstream), which is what stuff like
         | BlackMagic's bi-directional micro converters [1] do, but IIRC
         | that's only used for talkback to the camera operator and HDMI
         | CEC messages - which is how they implement things like their
         | ATEM Mini consoles triggering recording - not for DDC-style
         | format and framerate negotiations.
         | 
         | Also, device manufacturers don't do SDI on consumer devices
         | because SDI is by definition unencrypted and uncompressed, so
         | it's at odds with HDCP.
         | 
         | [1] https://www.blackmagicdesign.com/products/microconverters
        
           | ShadowBanThis01 wrote:
            | BMD has done some innovative things through their SDI port,
            | specifically enabling camera control with an Arduino add-on
            | board.
           | 
           | Someone finally did what I've wanted to see for years: built
           | a follow-focus unit that controls the focusing motors in the
           | lenses, so you don't have to bolt a ridiculous contraption
           | (and janky focusing-ring adapters) onto a lens to turn its
           | (non-mechanical) focusing ring manually.
        
             | mschuster91 wrote:
              | What I want to see is someone dump the FPGA in these
              | Micro Converters and hack it to add a USB interface. The
              | Micro Converters are _all_ the same inside: SerDes units
              | and clock/redrivers on the I/O ports, plus an FPGA that
              | additionally has a USB connection - currently used only
              | to power the converter, but it _could_ also be used to
              | do other things.
              | 
              | That FPGA should be powerful enough to do a lot of
              | interesting things - anything from fooling around with
              | commands (like with the Arduino board) to image
              | manipulation (e.g. embedding a watermark).
        
         | NavinF wrote:
         | That would suck. I'd need at least 4 coax cables for each
         | monitor.
         | 
         | Btw DP cables are often twinax and eDP cables are often
         | microcoax.
         | 
         | In a just, and virtuous world we'd all be using cheap SMF
         | cables with LC connectors
        
         | formerly_proven wrote:
         | 12G-SDI of 2015 is the first SDI revision to surpass DVI from
         | the 90s in terms of bandwidth. 24G-SDI, which seems to only
         | exist on paper so far, has about one quarter the bandwidth of a
         | DisplayPort 2.0 link.
         | 
         | It's unsurprisingly annoying trying to compete with a bundle of
         | cables using only one wire.
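Those bandwidth comparisons check out on nominal line rates alone - a rough sketch using the headline rates, ignoring each interface's different encoding overhead:

```python
# Nominal line rates in Gbit/s; encoding overhead (8b/10b, 128b/132b,
# SDI's own scrambling) is ignored, so the ratios are approximate.
rates_gbps = {
    "DVI single-link (165 MHz x 24 bit)": 0.165 * 24,  # ~3.96
    "12G-SDI": 11.88,
    "24G-SDI": 23.76,
    "DP 2.0 (UHBR20, 4 lanes)": 4 * 20.0,
}
for name, gbps in rates_gbps.items():
    print(f"{name:36s} {gbps:6.2f} Gbit/s")

ratio = rates_gbps["24G-SDI"] / rates_gbps["DP 2.0 (UHBR20, 4 lanes)"]
print(f"24G-SDI as a share of DP 2.0: {ratio:.0%}")
```

About 30% nominal - in the same ballpark as "one quarter" once payload overheads are accounted for - and 12G-SDI does clear single-link DVI's ~3.96 Gbit/s roughly threefold.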
        
       | spearman wrote:
       | > A carefully shaped money-backed lever over the market is
       | absolutely part of reason you never see DisplayPort inputs on
       | consumer TVs, where HDMI group reigns supreme
       | 
       | What does this actually mean?
        
         | green-salt wrote:
         | HDMI is licensed; DisplayPort doesn't need a license. I'm
         | sure there's money changing hands in the TV sector of most
         | display manufacturers.
        
       | Veliladon wrote:
       | HDMI predates DP by 5 years, switching wouldn't enable any
       | specific use case, and that 5-year head start means HDMI has
       | market inertia.
       | 
       | Yes, we should use DP because it's better, but the practicality?
       | There's so much HDMI gear that would have to be replaced or
       | require new active adaptors. It's so much industry and consumer
       | effort for such a marginal gain.
        
         | lucideer wrote:
         | Forget all the other pros/cons - MST makes it worth any
         | industry & consumer effort alone.
        
           | hot_gril wrote:
           | How many people use multiple external displays? For the power
           | users who do, there are already good enough solutions.
        
             | lucideer wrote:
             | I don't have data on this but if my anecdata is anything to
             | go by, power users aren't your typical multi-monitor
             | users...
        
         | patrickthebold wrote:
         | My monitor is fairly old, but it only does 30 Hz over the
         | HDMI cable and 60 Hz over the DisplayPort cable. I'm not sure
         | of the exact versions; I think it's DisplayPort 1.2 and HDMI
         | < 2. So for me, it was super important to get DisplayPort out.
        
         | babypuncher wrote:
         | HDMI also supports a lot of home theater specific features that
         | DisplayPort does not.
         | 
         | DisplayPort never could have replaced HDMI because DisplayPort
         | never tried to solve the same problems HDMI did.
        
         | metaphor wrote:
         | > _HDMI predates DP by 5 years..._
         | 
         | - HDMI v1.0 initial release = 2002-12-09
         | 
         | - DisplayPort v1.0 initial release = 2006-05-01
         | 
         | To be sure, just under 3.5 years; HDMI rolled into v1.3 a month
         | after DisplayPort v1.0 saw the light of day. Agreed on the
         | impact of market inertia.
        
           | Dalewyn wrote:
           | HDMI's marketshare has nothing to do with its headstart: HDMI
           | is backed by the entire entertainment industry and was
           | designed for televisions, while DisplayPort is backed by VESA
           | and other computer hardware manufacturers and was designed
           | for computer monitors.
           | 
           | For every computer monitor there are hundreds if not
           | thousands of televisions.
           | 
           | HDMI's marketshare has everything to do with who the players
           | involved are and just how much weight they have to throw
           | around. Even computer hardware generally has more HDMI ports
           | than DisplayPort ports.
        
       | Sparkyte wrote:
       | My argument: DisplayPort over USB-C is a better interface, for
       | flexibility and utility.
        
       | mschuster91 wrote:
       | > in fact, the first time we mentioned eDP over on Hackaday, was
       | when someone reused high-res iPad displays by making a
       | passthrough breakout board with a desktop eDP socket. Yes, with
       | DisplayPort, it's this easy to reuse laptop and tablet displays -
       | no converter chips, no bulky adapters, at most you will need a
       | backlight driver.
       | 
       | I'm a bit confused by the linked project's PCB [1] - if all
       | that's needed is a backlight driver, why all the differential
       | pairs between the microcontroller and the eDP FFC?
       | 
       | [1]
       | https://hackaday.io/project/369/gallery#9e4a0fef705befb8030b...
        
         | bpye wrote:
         | The schematics are available [0]. It looks to me like the MCU
         | is doing some sideband control, but the DisplayPort signal
         | itself is passed through unmodified.
         | 
         | [0] https://github.com/OSCARAdapter/OSCAR
        
       | crazygringo wrote:
       | > _Just like most digital interfaces nowadays, DisplayPort sends
       | its data in packets. This might sound like a reasonable
       | expectation, but none of the other popular video-carrying
       | interfaces use packets in a traditional sense - VGA, DVI, HDMI
       | and laptop panel LVDS all work with a stream of pixels at a
       | certain clock rate._
       | 
       | Funny, a stream of data at a constant rate makes much more sense
       | to me intuitively than packets, specifically for uncompressed
       | video.
       | 
       | Are there any downsides to packetization, like increased latency
       | or dropped frames or anything? Or not really, is it all upsides
       | in being able to trivially combine multiple data streams or
       | integrate easily into hubs?
        
         | wtallis wrote:
         | > Funny, a stream of data at a constant rate makes much more
         | sense to me intuitively than packets, specifically for
         | uncompressed video.
         | 
         | Sure, until you start trying to design the transceivers and
         | realize that supporting two or three fixed standard data rates
         | is a lot simpler than supporting a continuously-variable clock
         | speed. Every other high-speed digital interface operates at
         | just a few discrete speeds: SATA/SAS, PCIe, Ethernet, USB.
         | 
         | The fact that DVI and HDMI were such a shallow digitization of
         | VGA's racing-the-beam meant features like variable refresh rate
         | (Gsync/Freesync) showed up far later than they should have. If
         | we hadn't wasted a decade using CRT timings (and slight
         | modifications thereof) to drive LCDs over digital links, it
         | would have been more obvious that the link between GPU and
         | display should be negotiated to the _fastest_ data rate
         | supported by both endpoints rather than the lowest data rate
         | sufficient to deliver the pixels.
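The discrete-rate training wtallis describes can be sketched like so. The per-lane rates and 8b/10b efficiency are DP 1.x's real figures; the selection function itself is illustrative (it picks the slowest sufficient link, i.e. exactly the legacy policy being criticized), not the actual link-training algorithm:

```python
# DP 1.x per-lane line rates: RBR, HBR, HBR2, HBR3 (Gbit/s).
LANE_RATES_GBPS = [1.62, 2.70, 5.40, 8.10]
ENCODING_EFFICIENCY = 0.8  # 8b/10b: 8 payload bits per 10 line bits

def min_link_config(required_gbps, max_lanes=4):
    """Slowest standard (lanes, rate) combination whose payload
    bandwidth covers the requested video mode."""
    configs = sorted(
        ((lanes, rate) for lanes in (1, 2, 4) if lanes <= max_lanes
         for rate in LANE_RATES_GBPS),
        key=lambda c: c[0] * c[1],
    )
    for lanes, rate in configs:
        if lanes * rate * ENCODING_EFFICIENCY >= required_gbps:
            return lanes, rate
    raise ValueError("mode exceeds link capacity")

# 4K60 8bpc RGB needs roughly 594 MHz x 24 bit ~= 14.3 Gbit/s payload:
print(min_link_config(14.3))  # -> (4, 5.4), i.e. HBR2 x4
```

wtallis's argument is that always training to the fastest mutually supported configuration instead would have left the headroom needed for things like variable refresh from the start.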
        
           | ansible wrote:
           | > _... If we hadn 't wasted a decade using CRT timings (and
           | slight modifications thereof) to drive LCDs over digital
           | links ..._
           | 
           | Don't forget the decade or so (late 1990's to early 2000's)
           | where we were driving LCDs over _analog_ links (VGA
           | connectors).
           | 
           | I had purchased a pair of Silicon Graphics 1600SW monitors
           | back in the day, which required a custom Number Nine
           | graphics card with an OpenLDI display interface. It was many
           | years after those were introduced to the market before DVI
           | finally became commonplace on PC graphics cards.
           | 
           | Using the mass-market LCD monitors in the late 1990's was a
           | frustrating affair, where you had to manually adjust the
           | timing synchronization of the analog signal.
        
         | samtho wrote:
         | > Funny, a stream of data at a constant rate makes much more
         | sense to me intuitively than packets, specifically for
         | uncompressed video.
         | 
         | There is a data-rate floor that the outputting device must
         | meet. We've surpassed it (thanks to optimization at the
         | silicon level - hardware whose sole job is to send bursts of
         | data), and we end up with the buffer periodically running out
         | of data because the link is just so fast. Because analog is
         | real-time, you can't squeeze much else into that data stream,
         | but with digital we are afforded the luxury of packet
         | switching: instead of the line sitting idle, we can pump even
         | more down it.
         | 
         | > Are there any downsides to packetization, like increased
         | latency or dropped frames or anything? Or not really, is it all
         | upsides in being able to trivially combine multiple data
         | streams or integrate easily into hubs?
         | 
         | If I recall correctly, the timing and data rates are all
         | prearranged based on the reported capacity and abilities of
         | the receiving device, and it won't even attempt multiple
         | streams if the receiver is incapable of them or the
         | established data channel cannot fully support the required
         | bandwidth.
        
         | tverbeure wrote:
         | Latency is limited to the amount of buffering present in the
         | interface logic (on both sides). In the case of regular DP,
         | there are just a few FIFOs and pipeline stages, so the
         | latency is measured in nanoseconds.
         | 
         | When using display stream compression (DSC), there's a buffer
         | of 1 line, and a few FIFOs to handle rate control. At the
         | resolution for which DSC is used (say, 4K/144Hz), the time to
         | transmit a single line is around 3us. So that's the maximum
         | additional latency you can expect.
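The ~3 us figure is straightforward to reproduce; the total line count here is an assumption (2160 active lines plus a modest vertical blanking), since exact blanking varies by mode:

```python
# Worst-case added latency under DSC is about one line time,
# since the compressor buffers roughly one video line.
refresh_hz = 144
total_lines = 2222  # 2160 active + assumed vertical blanking

line_time_s = 1 / (refresh_hz * total_lines)
print(f"line time: {line_time_s * 1e6:.2f} us")  # ~3.13 us
```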
        
       | askura wrote:
       | The only downside of DP is how few devices actually have it,
       | versus HDMI / Mini HDMI.
       | 
       | This post has made me annoyed, though, as DP is clearly the
       | better standard.
        
         | nfriedly wrote:
         | Agree with you there. I'd be much happier if my Raspberry Pis,
         | cheap USB-C hubs, cheap monitors, etc. all had DP instead of
         | HDMI.
         | 
         | It's extra annoying when you realize that you're paying more
         | for royalties and sometimes additional hardware (e.g. in a
         | USB-C hub that uses DP internally but converts to HDMI for the
         | output) just to get an inferior interface.
        
       ___________________________________________________________________
       (page generated 2023-07-11 23:00 UTC)