https://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen

Transatlantic ping faster than sending a pixel to the screen?

Asked 9 years, 7 months ago · Active 2 years ago · Viewed 172k times · 851 votes

John Carmack tweeted,

    I can send an IP packet to Europe faster than I can send a pixel to the screen. How f'd up is that?

And if this weren't John Carmack, I'd file it under "the interwebs being silly".

But this is John Carmack. How can this be true?

To avoid discussions about what exactly is meant in the tweet, this is what I would like to get answered:

* How long does it take, in the best case, to get a single IP packet sent from a server in the US to somewhere in Europe, measured from the time that software triggers the packet to the point where it's received by software above the driver level?
* How long does it take, in the best case, for a pixel to be displayed on the screen, measured from the point where software above the driver level changes that pixel's value?

---------------------------------------------------------------------

Even assuming that the transatlantic connection is the finest fibre-optic cable that money can buy, and that John is sitting right next to his ISP, the data still has to be encoded in an IP packet, get from main memory across to his network card, travel from there through a cable in the wall into another building, probably hop across a few servers there (but let's assume it just needs a single relay), get photonized across the ocean, be converted back into an electrical impulse by a photosensor, and finally be interpreted by another network card. Let's stop there.

As for the pixel, this is a simple machine word that gets sent across the PCI Express slot, written into a buffer, which is then flushed to the screen.

Even accounting for the fact that "single pixels" probably result in the whole screen buffer being transmitted to the display, I don't see how this can be slower: it's not like the bits are transferred "one by one" - rather, they are consecutive electrical impulses which are transferred without latency between them (right?).
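As a rough sanity check on the first question, here is a back-of-envelope lower bound on the one-way transatlantic time. The ~6,000 km path length and the fibre refractive index of 1.47 are assumptions for illustration, not figures from the thread:

    # Back-of-envelope lower bound on one-way transatlantic latency.
    # Assumptions: ~6,000 km great-circle path (roughly New York - London),
    # light in silica fibre travelling at about c / 1.47.
    C_VACUUM_KM_S = 299_792      # speed of light in vacuum, km/s
    REFRACTIVE_INDEX = 1.47      # typical for silica fibre
    DISTANCE_KM = 6_000          # assumed path length; real cables run longer

    v_fibre = C_VACUUM_KM_S / REFRACTIVE_INDEX       # ~204,000 km/s
    one_way_ms = DISTANCE_KM / v_fibre * 1000
    print(f"propagation alone: {one_way_ms:.0f} ms one way")   # ~29 ms
    # Routing, queueing and a longer physical cable path push real one-way
    # times well above this, and ping reports the round trip (twice this).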
networking · graphics-card · ip · bandwidth

asked May 1 '12 at 9:30 by Konrad Rudolph; edited Nov 26 '19 at 9:36 by ECrownofFire

* 52 Either he's crazy or this is an unusual situation. Due to the speed of light in fiber, you cannot get data from the US to Europe in less than about 60 milliseconds one way. Your video card puts out an entire new screen of pixels every 17 milliseconds or so. Even with double buffering, you can still beat the packet by quite a bit. - David Schwartz May 1 '12 at 9:38
* 90 @DavidSchwartz: You're thinking of the GPU in isolation. Yes, the GPU can do a whole lot of work in less than 60 ms. But John is complaining about the entire chain, which involves the monitor. Do you know how much latency is involved from when the image data is transmitted to the monitor until it is shown on the screen? The 17 ms figure is meaningless and irrelevant. Yes, the GPU prepares a new image every 17 ms, and yes, the screen displays a new image every 17 ms. But that says nothing about how long the image has been en route before it was displayed. - jalf May 1 '12 at 9:59
* 26 He's a game programmer, and he said "faster than I can send a pixel to the screen"... so perhaps account for 3D graphics rendering delay? Though that should be quite low in most video games; they optimise for performance, not quality. And of course, there's the very high chance he's just exaggerating (there, I stated the obvious, happy?). - Bob May 1 '12 at 10:51
* 24 Go to Best Buy some time and watch all the TV sets, where they have them all tuned to the same in-house channel. Even apparently identical sets will have a noticeable (perhaps quarter-second) lag relative to each other. But beyond that there's having to implement the whole "draw" cycle inside the UI (which may involve re-rendering several "layers" of the image). And, of course, if 3D rendering or some such is required, that adds significant delay. - Daniel R Hicks May 1 '12 at 11:43
* 5 There is a lot of room for speculation in the question; I don't think there is a perfect answer unless you know what J. Carmack was really talking about. Maybe his tweet was just some stupid comment on some situation he encountered. - Baarn May 1 '12 at 12:09

3 Answers

1384 votes

The time to send a packet to a remote host is half the time reported by ping, which measures a round trip time.

The display I was measuring was a Sony HMZ-T1 head mounted display connected to a PC.

To measure display latency, I have a small program that sits in a spin loop polling a game controller, doing a clear to a different color and swapping buffers whenever a button is pressed. I video record both the game controller and the screen with a 240 fps camera, then count the number of frames between the button being pressed and the screen starting to show a change. The game controller updates at 250 Hz, but there is no direct way to measure the latency on the input path (I wish I could still wire things to a parallel port and use in/out asm instructions).
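A minimal sketch of that kind of spin-loop test, assuming pygame and a keyboard key standing in for the game controller button; this is only an illustration of the idea, not the actual measurement program:

    # Button-to-photon test: on the rising edge of a key press, clear the
    # window to the other colour and swap buffers immediately. Film the key
    # and the screen with a high-speed camera and count the frames between
    # the key going down and the colour change appearing.
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 480), pygame.DOUBLEBUF)
    colours = [(0, 0, 0), (255, 255, 255)]
    state, was_down = 0, False

    running = True
    while running:
        pygame.event.pump()                  # spin loop, never block
        keys = pygame.key.get_pressed()
        down = keys[pygame.K_SPACE]
        if down and not was_down:            # key just went down
            state ^= 1
            screen.fill(colours[state])      # clear to the other colour
            pygame.display.flip()            # swap buffers right away
            # Whether flip() waits for vertical sync depends on driver settings.
        was_down = down
        if keys[pygame.K_ESCAPE]:
            running = False
    pygame.quit()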
As a control experiment, I do the same test on an old CRT display with a 170 Hz vertical retrace. Aero and multiple monitors can introduce extra latency, but under optimal conditions you will usually see a color change starting at some point on the screen (vsync disabled) two 240 Hz frames after the button goes down. It seems there is 8 ms or so of latency going through the USB HID processing, but I would like to nail this down better in the future.

It is not uncommon to see desktop LCD monitors take 10+ 240 Hz frames to show a change on the screen. The Sony HMZ averaged around 18 frames, or 70+ total milliseconds. This was in a multimonitor setup, so a couple of frames are the driver's fault.

Some latency is intrinsic to a technology. LCD panels take 4-20 milliseconds to actually change, depending on the technology. Single chip LCoS displays must buffer one video frame to convert from packed pixels to sequential color planes. Laser raster displays need some amount of buffering to convert from raster return to back and forth scanning patterns. A frame-sequential or top-bottom split stereo 3D display can't update mid-frame half the time.

OLED displays should be among the very best, as demonstrated by an eMagin Z800, which is comparable to a 60 Hz CRT in latency, better than any other non-CRT I tested.

The bad performance on the Sony is due to poor software engineering. Some TV features, like motion interpolation, require buffering at least one frame, and may benefit from more. Other features, like floating menus, format conversions, content protection, and so on, could be implemented in a streaming manner, but the easy way out is to just buffer between each subsystem, which can pile up to a half dozen frames in some systems.

This is very unfortunate, but it is all fixable, and I hope to lean on display manufacturers more about latency in the future.

answered May 1 '12 at 14:24 by John Carmack; edited May 3 '12 at 16:48 by Peter Mortensen

* 234 I'd like to not have to lock this answer for excessive off-topic comments. We're all thrilled that John provided this answer, but we don't need 25 comments all expressing their gratitude, disbelief, or excitement. Thank you. - nhinkle May 2 '12 at 8:48
* 32 Your USB trigger is probably running as a low-speed USB device (bus frames at 125 usec), causing a minimal 8 ms delay (hardware issue). Maybe try a PS/2 keyboard instead? - Boris May 2 '12 at 9:10
* 8 @Marcus Lindblom by "hunt for", you mean read? I think in this case, how he got to his number is just as important as the number - the skepticism regarding the tweet is not going to be addressed by citing another number. Also the context helps - he was most directly annoyed by this specific monitor with its sub-optimal software. - Jeremy May 3 '12 at 11:54
* 16 It sounds like you are saying that when LCD makers claim, say, a 5 ms response time, that may be the time it takes the raw panel to change, but the monitor adds quite a bit more time buffering and processing the signal before it actually drives the LCD. Doesn't that mean the manufacturers are publishing false/misleading specs? - psusi May 3 '12 at 18:19
* 13 @psusi doubledeej.blogspot.com/2009/07/... zdnet.com/blog/ou/... gizmodo.com/5669331/why-most-hardware-specs-are-total-bullshit maximumpc.com/article/features/display_myths_shattered - Dan Is Fiddling By Firelight May 4 '12 at 12:48
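For reference, the camera-frame counts in the answer above convert to milliseconds as follows (a quick arithmetic check assuming the 240 fps camera described there):

    # Convert counts of 240 fps video frames into milliseconds of latency.
    CAMERA_FPS = 240
    frame_ms = 1000 / CAMERA_FPS          # ~4.17 ms per captured frame

    for label, frames in [("CRT, vsync off", 2),
                          ("typical desktop LCD", 10),
                          ("Sony HMZ-T1 (average)", 18)]:
        print(f"{label}: {frames} frames ~= {frames * frame_ms:.0f} ms")
    # -> about 8 ms, 42 ms and 75 ms respectively, matching the answer's
    #    "8 ms or so", "10+ frames" and "70+ total milliseconds".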
72 votes

Some monitors can have significant input lag. Given an excellent Internet connection and a poor monitor and video card combination, it's possible.

Source: Console Gaming: The Lag Factor, page 2:

    So, at 30 FPS we get baseline performance of eight frames/133 ms, but in the second clip where the game has dropped to 24 FPS, there is a clear 12 frames/200 ms delay between me pulling the trigger, and Niko beginning the shotgun firing animation.

That's 200 ms plus the additional delay from your screen. Ouch. A display can add another 5-10 ms. So a console can have up to 210 ms of lag, and, as per David's comment, the best case should be about 70 ms for sending a packet.

answered May 1 '12 at 10:26 by Akash; edited Jul 30 '12 at 7:34 by mmdemirbas

* 1 -1 I don't think that John Carmack uses a crappy monitor or video card. Please reference your claim with credible sources. - Baarn May 1 '12 at 10:41
* 14 Sorry, but I still don't see this really answering the question. The quote talks about "pulling the trigger", and this implies much more work, as in input processing, scene rendering etc., than just sending a pixel to the screen. Also, human reaction speed is relatively lousy compared to modern hardware performance. The time between the guy thinking he pulled the trigger, and actually pulling it, could well be the bottleneck. - Konrad Rudolph May 1 '12 at 10:57
* 2 The linked article shows that the author of this analysis purchased a special device that can show you exactly when the button was pressed, so I don't think they're just winging the numbers. - Melikoth May 1 '12 at 13:40
* 13 @KonradRudolph: Perception is pretty weird stuff. I read an article a while ago about an experimental controller that read impulses directly off the spinal cord. People would feel that the computer was acting before they had clicked, even though it was their own nerve command to click it was reacting to. - Zan Lynx May 1 '12 at 16:48
* 12 @Zan Lynx: This is a known effect. Google for "Benjamin Libet's Half Second Delay". Human consciousness requires significant processing time. Everything you think is happening now actually happened in the past. All your senses are giving you an "integrated multi-media experience" of an event from half a second ago. Furthermore, events appear to be "time stamped" by the brain. A direct brain stimulation has to be delayed relative to a tactile stimulation in order for the subject to report the sensations as simultaneous! - Kaz May 1 '12 at 21:24

41 votes

It is very simple to demonstrate input lag on monitors: just stick an LCD next to a CRT, show a clock or an animation filling the screen, and record it. One can be a second or more behind. It is something that LCD manufacturers have tightened up on since gamers and others have noticed it. E.g. YouTube video: Input Lag Test Vizio VL420M

answered May 3 '12 at 10:31 by JamesRyan; edited Jul 30 '12 at 7:34 by mmdemirbas
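One simple way to produce the clock for that side-by-side test is a full-screen millisecond counter; the pygame version below is just one possible sketch. Clone or mirror the desktop to both displays and film or photograph them together:

    # Full-screen millisecond counter to show on an LCD and a CRT at once;
    # film both screens and read off the difference in the displayed values.
    import pygame, time

    pygame.init()
    screen = pygame.display.set_mode((800, 600))
    font = pygame.font.Font(None, 200)
    start = time.perf_counter()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        ms = int((time.perf_counter() - start) * 1000)
        screen.fill((0, 0, 0))
        screen.blit(font.render(f"{ms} ms", True, (255, 255, 255)), (50, 200))
        pygame.display.flip()
        # The counter only advances once per refresh, so it resolves the lag
        # to the nearest frame rather than the nearest millisecond.
    pygame.quit()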