Post ALOd7kiXCSrR9Ob99s by wolf480pl@mstdn.io
(DIR) More posts by wolf480pl@mstdn.io
(DIR) Post #ALOac0GfeF98ClIY8e by wolf480pl@mstdn.io
2022-07-11T22:20:01Z
0 likes, 0 repeats
Why is it the GPU that generates sync signals and not the monitor? The GPU has a separate framebuffer and CRTC for each monitor anyway. Meanwhile if a monitor wants to switch between inputs, it has to go blank until (I presume) it syncs up with the new input...
(DIR) Post #ALOan9x5yyf7QRSghU by icedquinn@blob.cat
2022-07-11T22:22:13.410776Z
0 likes, 0 repeats
@wolf480pl :comfythink: that is how it works with audio interfaces.
(DIR) Post #ALOatVs5OslN909Faq by icedquinn@blob.cat
2022-07-11T22:23:21.651252Z
0 likes, 0 repeats
@wolf480pl if i had to guess it's because you actually render to a buffer and then trigger a page flip. the monitor can't know when the buffer is "done" since it doesn't do the work, the GPU does, and the GPU knows when a frame is done. for an audio interface and DSP code the interface owns the control and asks you when it's ready for more bytes.
(DIR) Post #ALOb4sRBq99CJjzE36 by sunbearshaman@noagendasocial.com
2022-07-11T22:25:25Z
0 likes, 0 repeats
@wolf480pl The sync signal keeps the GPU and the monitor in sync. What you're proposing makes no sense, how could a device sync to itself? Sync necessarily means between two things. Any one thing is trivially in sync with itself.
(DIR) Post #ALObG5LYBWapXVh2FU by wolf480pl@mstdn.io
2022-07-11T22:22:19Z
0 likes, 0 repeats
Also imagine how cheap it'd be to composite video from multiple sources into one picture if the sources could take external sync signals. I think that's what TV broadcasters (used to) do, but not with customer-grade equipment :(
(DIR) Post #ALObGevvsFUXuTz5Jg by wolf480pl@mstdn.io
2022-07-11T22:23:13Z
0 likes, 0 repeats
@icedquinn you mean the DAC generates clock?
(DIR) Post #ALObHyvl0i4gDgX6Qa by wolf480pl@mstdn.io
2022-07-11T22:27:46Z
0 likes, 0 repeats
@icedquinn but the monitor won't accept vsync other than 60Hz (or a few other refresh rates from the support list). You can't just emit vsync whenever you're done rendering a frame. You either have to wait for the CRTC to emit vsync ("vsync on" in game settings), or continue anyway and draw over a buffer that's currently being scanned out (tearing), or do triple buffering
(DIR) Post #ALObfAd1gTWfA0pRsO by icedquinn@blob.cat
2022-07-11T22:31:58.953885Z
1 likes, 0 repeats
@wolf480pl
> You can't just emit vsync whenever you're done rendering a frame.
:comfyeyes: freesync begs to differ
(DIR) Post #ALObrL9iMzgfHjvCVc by wolf480pl@mstdn.io
2022-07-11T22:29:45Z
0 likes, 0 repeats
@sunbearshaman well the monitor would send the sync signals to the GPU. Hsync to tell it to start sending the next line, vsync to tell it to start sending the next frame.
(DIR) Post #ALOcGGP3B2xxmhylPc by sunbearshaman@noagendasocial.com
2022-07-11T22:38:41Z
0 likes, 0 repeats
@wolf480pl What's the use of a sync signal for a GPU when all it can do is go as fast as it can? If the monitor wanted data faster, tough shit. The sync signal has to come from the producer, the monitor is a consumer. It just doesn't make sense the other way.
(DIR) Post #ALOcStv25GnHodxrTE by wolf480pl@mstdn.io
2022-07-11T22:35:30Z
0 likes, 0 repeats
@icedquinn well ok but freesync/Gsync/VRR is a recent thing, most monitors don't have it, and for most of the time GPUs existed, no monitors had it. So there was plenty of time to try sending sync backwards
(DIR) Post #ALOd5zKJuZj3sjQlma by wolf480pl@mstdn.io
2022-07-11T22:42:25Z
0 likes, 0 repeats
Now that I think of it, it's a bad idea. At 4k60Hz you have ~600 MHz pixel clock, that means a pixel is ~50cm long on the wire (ignoring the fact that light in a wire is slower). If you're sending sync and color in the same direction, that's not a problem, they get delayed the same amount. But if you're sending sync backwards, a 2m cable has ~8 pixels of round-trip time, and you'd need some link training to compensate for that.
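The arithmetic above checks out; a minimal back-of-envelope sketch (the 594 MHz figure is the standard CTA-861 pixel clock for 3840x2160@60 including blanking, and the speed-of-light value ignores the cable's velocity factor, as the post does):

```python
# Back-of-envelope check of the backwards-sync propagation-delay argument.
C = 3e8               # speed of light in vacuum, m/s
PIXEL_CLOCK = 594e6   # pixel clock for 3840x2160@60 incl. blanking, Hz

# Distance one pixel period spans on the wire: ~0.5 m per pixel.
pixel_length_m = C / PIXEL_CLOCK

# Sync travelling monitor->GPU plus pixels travelling GPU->monitor:
# a 2 m cable costs one round trip, i.e. ~8 pixel periods of skew.
cable_m = 2.0
round_trip_pixels = 2 * cable_m / pixel_length_m

print(f"one pixel ~ {pixel_length_m:.2f} m on the wire")
print(f"2 m cable round trip ~ {round_trip_pixels:.1f} pixel periods")
```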
(DIR) Post #ALOd7CS2bHSusOcUjI by LovesTha@floss.social
2022-07-11T22:44:26Z
0 likes, 0 repeats
@wolf480pl Because the technology came across from TV where the sync needs to be part of the broadcast I would presume.
(DIR) Post #ALOd7kiXCSrR9Ob99s by wolf480pl@mstdn.io
2022-07-11T22:46:27Z
1 likes, 0 repeats
@sunbearshaman the GPU can't send vsync as fast as it renders frames. With normal monitors, the monitor requires the refresh rate to be 60, 120 or 144Hz, and if you deviate from that too far it will refuse to display the signal. VRR monitors (freesync/gsync) allow for more deviation but it's a fairly new thing. In most cases, the CRTC, which is the part of the GPU that generates sync signals, does so at a constant rate, and the render process has to wait for it (or risk broken frames)
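The "render process has to wait" part can be sketched as a toy model (this is an illustration only, not how any real driver schedules flips): the CRTC ticks at a fixed 60 Hz, and a finished frame can only be flipped at the next tick.

```python
import math

# Toy model of "vsync on": the CRTC emits vsync at a fixed 60 Hz,
# and a frame that finishes rendering early waits for the next tick.
REFRESH_HZ = 60
FRAME_PERIOD_MS = 1000 / REFRESH_HZ   # ~16.67 ms between vsync pulses

def flip_time_ms(render_ms: float) -> float:
    """Earliest vsync tick at or after rendering completes."""
    return math.ceil(render_ms / FRAME_PERIOD_MS) * FRAME_PERIOD_MS

# A 5 ms frame still waits until the ~16.67 ms tick;
# a 20 ms frame misses one vsync and flips at ~33.33 ms.
print(flip_time_ms(5.0))
print(flip_time_ms(20.0))
```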
(DIR) Post #ALOd88B1ygJzuR6jGi by wolf480pl@mstdn.io
2022-07-11T22:48:13Z
0 likes, 0 repeats
@LovesTha yeah, I think so too. When you're broadcasting the same signal to multiple receivers there needs to be one sync for all of them, so it's natural that the source would generate it. Also it plays well with propagation delays, as the sync gets delayed by the same amount as the pixel data
(DIR) Post #ALOdkBXFwImBKUeHvE by LovesTha@floss.social
2022-07-11T22:51:07Z
0 likes, 0 repeats
@wolf480pl Which also goes to being able to hook up multiple monitors to the same PC feed. (Think a classroom setup, which can be handy)
(DIR) Post #ALOdl0OUqd4Z45wVqi by wolf480pl@mstdn.io
2022-07-11T22:53:36Z
0 likes, 0 repeats
@LovesTha I don't think I've ever seen a passive splitter for connecting multiple monitors to the same source, but I know there are video wall controllers, which I guess are active splitters, and they probably don't want to have a frame buffer
(DIR) Post #ALOeLyrpUIP12nIImO by LovesTha@floss.social
2022-07-11T22:56:50Z
0 likes, 0 repeats
@wolf480pl They were only a thing for analogue monitors. EG https://www.gadgets4geeks.com.au/27cm-vga-hdb15-male-to-female-dual-monitor-splitter-cable?msclkid=2a46a33e34651237867dc1f29e1938af
(DIR) Post #ALOhDHv7ZxsrOONeca by vertigo@hackers.town
2022-07-11T23:01:54Z
0 likes, 0 repeats
@LovesTha @wolf480pl There's also the issue of preference. One monitor might prefer a 1280*800 resolution display while another might prefer 2560*1600. You'd think it's a simple matter of scaling the video, but it's not. Horizontally, yes, you can scale, but vertical scaling requires the video source to rescan rasters. Thus, if your video card can only handle the lower resolution, it will become incompatible with the high resolution monitor. You'll need to buy new video cards (or, at least, video scan converters; vis-a-vis "flicker fixers" in the Commodore-Amiga market) every time you change monitors, which isn't terribly economical.
(DIR) Post #ALOhDINTsWZKoLeIpE by wolf480pl@mstdn.io
2022-07-11T23:30:57Z
1 likes, 0 repeats
@vertigo @LovesTha how does having sync generated at the GPU help in this case? Sure, the GPU can choose any resolution it likes, without consulting monitor's support list in EDID. But then the monitor will just refuse to display it. And if the GPU was to negotiate a mutually supported resolution through EDID, it could as well ask the monitor for a specific res in the backward sync scenario.
(DIR) Post #ALOhrKfilfv7VfGuqO by wolf480pl@mstdn.io
2022-07-11T23:36:46Z
0 likes, 0 repeats
@vertigo @LovesTha btw. I like how you used 16:10 aspect ratio
(DIR) Post #ALOpx86hCE6puYAFg8 by apparentlymart@mastodon.online
2022-07-12T01:05:35Z
0 likes, 0 repeats
@wolf480pl syncing one source to another as video passes through has been proven in the past though: Amigas with genlocks in the days of standard definition analog, and NeTV with HDMI more recently. Relies on all but one of the sources being able to sync to an external signal as you say, though.
(DIR) Post #ALPICIBenZf0EZVGz2 by benis@cawfee.club
2022-07-12T06:28:34.005972Z
0 likes, 0 repeats
@wolf480pl probably a tradition that originated in the days of CRTs out of necessity
(DIR) Post #ALPp5zBroNk24DCazw by wolf480pl@mstdn.io
2022-07-12T12:32:01Z
1 likes, 0 repeats
@roboneko @icedquinn @LovesTha Not sure about the advanced HDMI features, but the core of it is the same as DVI, which is basically VGA with a TMDS ser/des for each of the R/G/B channels instead of a DAC/ADC.
(DIR) Post #ALPphZ8Z7pXuGhZrKy by wolf480pl@mstdn.io
2022-07-12T12:38:41Z
0 likes, 0 repeats
@roboneko @icedquinn @LovesTha Also, for CRTs, all that matters is that the CRT's sweep oscillators are locked to the same hsync/vsync signals as the source's sweep oscillators. It doesn't matter which one generates the sync signals, as long as they're close enough that there isn't a significant propagation delay (which is not the case for broadcast TV).
(DIR) Post #ALPsoWECdhGD8TDmVM by wolf480pl@mstdn.io
2022-07-12T13:15:04Z
1 likes, 0 repeats
@roboneko @icedquinn @LovesTha yeah, but then in the TV studio you don't have a single video source, you have many of them, all genlocked to common sync signals. Though as I later learnt, even at such short distances the propagation delays are significant so video mixers need adjustable delays on each input...
(DIR) Post #ALPwKN4JXNsos2mFpA by xue@bae.st
2022-07-12T13:58:15.071865Z
0 likes, 0 repeats
@wolf480pl so it would be one-directional communication. also hdmi is just vga but advanced, so it inherits all the quirks from it
(DIR) Post #ALRPwkSC8ExeEPYino by wolf480pl@mstdn.io
2022-07-13T07:04:47Z
0 likes, 0 repeats
@roboneko @icedquinn @LovesTha https://en.m.wikipedia.org/wiki/Analog_delay_line
(DIR) Post #ALRblmx7hS6iFJZ9m4 by wolf480pl@mstdn.io
2022-07-13T09:16:10Z
0 likes, 0 repeats
@roboneko @icedquinn @LovesTha obtuse! I feel vicariously insulted on behalf of all the analog engineers
(DIR) Post #ALRdmrFJQFwpl9iAeO by wolf480pl@mstdn.io
2022-07-13T09:34:45Z
1 likes, 0 repeats
@roboneko @icedquinn @LovesTha I'd think they'd call it "cursed", "arcane", "inscrutable", but not "obtuse" or "dull"
(DIR) Post #ALRfq8sUWWyvDljNlQ by wolf480pl@mstdn.io
2022-07-13T09:59:58Z
1 likes, 0 repeats
@roboneko really? I looked it up in 3 dictionaries and all of them said it meant something like "intellectually unsophisticated", "not smart", etc. but I'm not a native English speaker...