# Fast machines, slow machines

June 27, 2023 · About 16 minutes · Tags: featured, opinion, twitter-thread

Well, that was unexpected. I recorded a couple of crappy videos in 5 minutes, posted them on a Twitter thread, and went viral with 8.8K likes at this point. I really could not have predicted that, given that I've been posting what-I-believe-is interesting content for years and… nothing, almost-zero interest.

Now that things have cooled down, it's time to stir the pot and elaborate on those thoughts a bit more rationally.

To summarize, the Twitter thread shows two videos: one of an old computer running Windows NT 3.51 and one of a new computer running Windows 11. In each video, I opened and closed a command prompt, File Explorer, Notepad, and Paint. You can clearly see how apps on the old computer open up instantly whereas apps on the new computer show significant lag as they load. I questioned how computers are actually getting better when trivial things like this have regressed. And boom, the likes and reshares started coming in. Obviously some people had issues with my claims, but there seems to be an overwhelming majority of people that agree we have a problem.

To open up, I'll stand my ground: latency in modern computer interfaces, with modern OSes and modern applications, is terrible and getting worse. This applies to smartphones as well. At the same time, while UIs were much more responsive on computers of the past, those computers were also awful in many ways: new systems have changed our lives substantially. So, what gives?

## The original comparison

Let's address the elephant in the room first. The initial comparison I posted wasn't fair and I was aware of that going in. That said, I knew repeating the experiment "properly" would yield the same results, so I plowed ahead with whatever I had right then. The videos were unplanned because the idea for the tweets came to mind when I booted the old machine, clicked on Command Prompt, and was blown away by the immediacy of starting the app.

The original comparison videos showed:

* An AMD K7-600 with 128MB of RAM and a 5400 RPM HDD running Windows NT 3.51. This was a machine from the year 1999-2000 with an OS that was about 5 years older than it. Hardware was experiencing really fast improvements back then, particularly in CPU speeds, and you were kinda expected to keep up with the 2-year upgrade treadmill or suffer from incredible slowness. All this is to say that this machine was indeed overpowered for the OS I used.

  > Please remind me how we are moving forward. In this video, a machine from the year ~2000 (600MHz, 128MB RAM, spinning-rust hard disk) running Windows NT 3.51. Note how incredibly snappy opening apps is. pic.twitter.com/YEO824vIqI
  >
  > -- Julio Merino (@jmmv) June 22, 2023

* A Surface Go 2 with an Intel Core m3 CPU, 8GB of RAM, and an SSD running Windows 11. This is a 3-year-old machine that shipped with Windows 10, but Windows 11 is officially supported on it--and as you know, that means you are tricked into upgrading. This is not a powerful machine by any means, but: first, it's running the verbatim Microsoft experience, and second, it should be much more powerful than the K7 system, shouldn't it? We are continuously reminded that any computer or phone today has orders of magnitude more power than past machines.
  > Now look at opening the same apps on Windows 11 on a Surface Go 2 (quad-core i5 processor at 2.4GHz, 8GB RAM, SSD). Everything is super sluggish. pic.twitter.com/W722PNEGv0
  >
  > -- Julio Merino (@jmmv) June 22, 2023

Oh, and yes, I quoted the wrong hardware specs in the original tweet. Looking again at how I made that mistake: I searched for "Surface Go 2" in Bing, landed on the "Surface Laptop Go 2" page, and copied what I saw there without noticing that it wasn't accurate. All apps had been previously opened, so they should all have been comfortably sitting in RAM.

## The better comparison

Obviously various people noticed that there was something off with my comparison (unfair hardware configurations, wrong specs), so I redid the comparison once the thread started gaining attention:

* Windows 2000 on the K7-600 machine (see installation thread). This is an OS from 1999 running on hardware from that same year. And, if you ask me, this was the best Windows release of all time: a super-clean UI on an NT core, carrying all of the features you would want around performance and stability (except with terrible boot times). As you can see, things still fare very well for the old machine in terms of UI responsiveness.

  > For those thinking that the comparison was unfair, here is Windows 2000 on the same 600MHz machine. Both are from the same year, 1999. Note how the immediacy is still exactly the same and hadn't been ruined yet. pic.twitter.com/Tpks2Hd1Id
  >
  > -- Julio Merino (@jmmv) June 23, 2023

* Windows 11 on a Mac Pro 2013 (see installation instructions) with a 6-core Xeon E5-1650v2 at 3.5GHz, 32GB of RAM, dual GPUs, and an SSD that can sustain up to 1GB/s. I know, this is a 10-year-old machine at this point running a more modern OS. But please, go ahead, tell me with a straight face how hardware with these specs cannot handle opening trivial desktop applications without delay. I'll wait.

  > Oh, and one more thing. Yes, yes, the Surface Go 2 is underpowered and all you want. But look at this video. Same steps on a 6-core Mac Pro @ 3.5GHz with 32GB of RAM. All apps cached. Note how they get painted in chunks. It's not because of animations or mediocre hardware. pic.twitter.com/9TOGAdaTXO
  >
  > -- Julio Merino (@jmmv) June 23, 2023

The reason I used the Mac Pro is that it is the best machine I have running Windows right now and, in fact, it's my daily driver. But again, I do not care about how running this comparison on an "old" machine might be "inaccurate". Back when I left Microsoft last year, I was regularly using a Z4 desktop from 2022, a maxed-out quad-core i7 ThinkPad with 32GB of RAM, and an i7 Surface Laptop 3 with 16GB of RAM. Delays were shorter on these, of course, but interactions were still noticeably slow.

So, in any case: I agree the original comparison was potentially flawed but, as you can see, a better comparison yields the same results--which I knew it would. After years upon years of computer usage, you gain an intuition for how things should behave, and trusting that intuition tends to work well--as long as you validate your assumptions later, don't get me wrong!

## Computer advancements

Let's put the tweets aside and talk about how things have changed since the 2000s. I jokingly asked how we are "moving forward" as an industry, so it's worth looking into it. Indeed, we have moved forward in many aspects: we now have incredible graphics and high-resolution monitors, super-fast networks, real-time video editing, and much more.
All of these have improved over the years and it is very true that these advancements have allowed certain life transformations to happen. Some examples: the ability to communicate with loved ones much more easily thanks to great-quality videoconferencing; the ability to have a streaming "cinema at home"; and the painless switch to remote work during the pandemic^1.

We have also moved forward on the I/O side. Disk I/O had always been the weakest spot of past systems. Floppy disks were unreliable and slow. CDs and DVDs were slightly more reliable but also slow. HDDs were the bottleneck for lots of things: their throughput improved over time, allowing things like higher-resolution video editing and the like, but random I/O hit physical limits--and fast random I/O is what essentially drives desktop responsiveness.

Then, boom, SSDs appeared and started showing up on desktops. These were a game-changer because they fixed the problem of random I/O. All of a sudden, booting a computer, launching a heavy game, opening folders with lots of small photos, or simply just using your computer… all improved massively. It's hard to explain the usability improvements that these brought if you did not live through this transition, and it's scary how those improvements are almost gone; more on that later.

Other stuff also improved, like the simplicity of installing new hardware, the pervasiveness of wireless connections and devices, the internationalization of text and apps (Unicode is neither easy nor cheap, I'll grant that)… all providing more usable machines in more contexts than ever.

So yeah, things are better in many areas and we have more power than ever. Otherwise, we couldn't do things like ML-assisted photo processing on a tiny phone, which was unimaginable in the 2000s.

## Terrible latency

Yet… none of these advancements justify why things are as excruciatingly slow as they are today in terms of UI latency. Old hardware from the year 1999, combined with an OS from that same year, shows that responsive systems have existed^2. If anything, all these hardware improvements I described should have made things better, not worse, shouldn't they?

Some replied to the comparison telling me that graphical animations and bigger screens are "at fault" because we have to draw more pixels, and thus the fact that we have these new niceties means we have to tolerate slowness. Well, not quite. Witness for yourself:

> And... one more thing? To those saying: "it's the higher 4K resolution!" or "it's the good-looking animations!" or "it's the pretty desktop background!"--no, they aren't at fault. See, the slowness is still visible with all of these disabled. In the end... blog post coming soon. pic.twitter.com/9BQy6IpK6a
>
> -- Julio Merino (@jmmv) June 26, 2023

GPUs are a commodity now, and they lift the heavy burden of graphics management from the CPU. The kinds of graphical animations that a desktop renders are extremely cheap to compute, and this has been proven by macOS since its launch: all graphical effects on a macOS desktop feel instant. The effects do delay interactions though--the desktop-switching animation is particularly intrusive, oh god how I hate that thing--but the delays generally come from intentional pauses during the animation. And when the effects introduce latency because the GPU cannot keep up, such as when you attach a 4K display to a really old Mac, then it's painfully obvious that animations stutter due to lack of power.
I haven't encountered the latter in any of the videos above though, which is why animations and the like have nothing to do with my concerns.

So, please, think about it with a critical mind. How is the ability to edit multiple 4K video streams in real time, or the ability to stream a 4K movie, supposed to make starting apps like Notepad slower? Or opening the context menu on the desktop? Or reading your email? These new abilities demand much more power from the CPU and GPU, but they shouldn't take performance away from tasks that are essentially I/O-bound. Opening a simple app shouldn't be slower than it was more than 20 years ago; it really shouldn't be. Yet here we are.

The reasons for the desktop latency come from elsewhere, and I have some guesses for those. But first, a look at a couple of examples.

## Examples

In Windows land, there are two obvious examples I want to bring up and that were mentioned in the Twitter thread:

* Notepad had been a native app until very recently, and it still opened pretty much instantaneously. With its rewrite as a UWP app, things went downhill. The before and after are apparent, and yet… the app continues to be as unfeatureful as it has always been. This is extra slowness for no user benefit.

* As for Windows Terminal, sure, it is nicer than anything that came before it, but it is visibly much, much heavier than the old Command Prompt. And if you add PowerShell into the mix, we are talking about multiple seconds for a new terminal window to be operational unless you have top-of-the-line hardware.

macOS fares better than Windows indeed, but it still has its issues. See this example contributed by @AlexRugerMusic. Even the mighty M1 has trouble opening up the system settings app:

> Another example:
>
> Left: 2006 MBP (2GHz Core 2 Duo, 2GB DDR2 RAM, Max OS X 10.6.8; has SSD)
> Right: 2021 MacBook Air (M1, 16GB RAM, macOS 13)
>
> Apps open about twice as fast on the Pro as they do on the Air (most, not just System Preferences/Settings). pic.twitter.com/PjMX1DI4uz
>
> -- rewgs (@AlexRugerMusic) June 27, 2023

Linux is probably the system that suffers the least from these issues as it still feels pretty snappy on modest hardware. Fedora Linux 38, released in April 2023, runs really well on a micro PC from 11 years ago--even if GNOME or KDE had been resource hogs back in the day. That said, this is only an illusion. As soon as you start installing any modern app that wasn't developed exclusively for Linux… the slow app start times and generally poor performance show up.

Related, but I feel this needs saying: the biggest shock for me was when I joined Google back in 2009. At the time, Google Search and Gmail had stellar performance: they were examples to follow. From the inside though… I was quite shocked by how all internal tools crawled, and in particular by how slow the in-house command line tools were. I actually fault Google for the situation we are in today, due to their impressive internal systems and their relentless push for web apps at all costs, which brings us to…

## Causes

How does this all happen? It's easy to say "Bloat!", but that's a hard thing to define because bloat can be justified: what one person considers bloat is not the same as what another person considers bloat. After all, "80% of users only use 20% of the software they consume" (see the Pareto principle), but the key insight is that the 20% each user consumes differs from user to user. So bloat isn't necessarily in the features offered by the software; it's elsewhere.
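One place where that "elsewhere" shows up very concretely is in how long a process takes just to start. As a rough illustration (this is not the methodology behind the videos above, just a sketch), the following Python snippet times a few commands from launch to exit; the command list is only an example, and absolute numbers vary wildly across machines:

```python
# Rough sketch: time how long a few commands take just to start and exit.
# The commands below are examples only; swap in whatever shells or runtimes
# you have installed. This captures process creation plus runtime startup,
# not window painting, but the relative gaps are already telling.
import subprocess
import time

COMMANDS = {
    "cmd": ["cmd", "/c", "exit"],                     # classic Command Prompt
    "powershell": ["powershell", "-Command", "exit"],
    "python": ["python", "-c", "pass"],
}

for name, argv in COMMANDS.items():
    samples = []
    try:
        for _ in range(5):
            start = time.monotonic()
            subprocess.run(argv, stdout=subprocess.DEVNULL,
                           stderr=subprocess.DEVNULL)
            samples.append(time.monotonic() - start)
    except FileNotFoundError:
        print(f"{name:>10}: not installed; skipping")
        continue
    print(f"{name:>10}: best of {len(samples)} runs = {min(samples) * 1000:.0f} ms")
```

On most machines, a small native binary comes back almost immediately, whereas a runtime that has to load and initialize a large framework before doing anything takes noticeably longer: that startup tax is the kind of latency discussed next.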
So then we have frameworks and layers of abstraction, which seem to introduce bloat for bloat's sake. But I'm not sure this is correct either: abstraction doesn't inherently have to make things slower, as Rust has proven.

What makes things slower are priorities. Nobody prioritizes performance anymore except for the critical cases where it matters (video games, transcoding video, and the like). What people (companies) prioritize is developer time. For example: you might not want to use Rust because its steep learning curve means you'll spend more time learning than delivering, or because its longer compile times mean that you'll spend more time waiting for the compiler than ~~shipping~~ debugging production. Or another example: you might not want to develop native apps because that means "duplicate work", so you reach for a cross-platform web framework. That is, Electron.

I know it's easy to dunk on Electron, but there are clear telltale signs that this platform is at fault for a lot of the damage done to desktop latency. Take 1Password's 8th version, which many users who migrated from the 7th version despise due to the slowness of the new interface. Or take Spotify, which used to prioritize startup and playback latency over anything else at its inception and, as you know if you use it, that's not true any more:

> 2009 version: 20MB fully native cocoa app, launches in less than a second, instant feedback when you click stuff, playback usually starts within 50ms pic.twitter.com/Enzi40PDCX
>
> -- Rasmus Andersson (@rsms) May 10, 2023

These apps were rewritten in Electron to offer a unified experience across desktops and to cut down costs… but for whom? The cost cuts were for the companies owning the products, not for the users. Such cuts impose a tax on every one of us due to our day-to-day frustrations and the need to unnecessarily upgrade our hardware. Couple these rewrites with the fact that OSes cannot reuse the heavy framework across apps (same idea as how using all RAM as a cache is a flawed premise)… and the bloat quickly adds up when you run these apps concurrently.

Leaving Electron aside, another decision that likely introduces latency is the mass adoption of managed and interpreted languages. I know these are easy to dunk on as well, but that's because we have reasons to do so. Take Java or .NET: several Windows apps have been slowly rewritten in C# and, while I have no proof of this, I'm convinced from past experience that this can be behind the sluggishness we notice. The JDK and the CLR do an amazing job at optimizing long-running processes (their JITs can do PGO with runtime data), but quick startup is not something they handle well. This is why, for example, Bazel spawns a background server process to paper over startup latency and why Android has gone through multiple iterations of AOT compilation. (Edit: there must be other reasons, though, that I have not researched. As someone pointed out, my assumption that Windows Terminal was mostly C# is not true.)

More on this in Wirth's law.

## One-off improvements eaten away

To conclude, let me end on a pessimistic note by going back to hardware advancements. The particular improvement that SSDs brought us was a one-off transformation. HDDs kept getting faster for years indeed, but they never could deliver the kind of random I/O that desktops require to be snappy. The switch to SSDs brought a kind of improvement that was at a different level.
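If you want to get a feel for why random I/O is the property that matters here, a rough sketch like the one below compares sequential and random reads over the same file. It is only an illustration: the file path is hypothetical, and the numbers are heavily distorted by the OS page cache unless the file is cold.

```python
# Rough sketch: compare sequential vs. random 4KB reads over the same file.
# Assumes a pre-existing multi-GB test file (the path below is hypothetical)
# and ignores page-cache effects, which dominate unless the file is cold.
import os
import random
import time

PATH = "/tmp/testfile.bin"  # hypothetical; any large file works
BLOCK = 4096                # page-sized reads, roughly what app startup does
READS = 10_000

size = os.path.getsize(PATH)

def timed(label, offsets):
    with open(PATH, "rb", buffering=0) as f:
        start = time.monotonic()
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
        elapsed = time.monotonic() - start
    print(f"{label}: {READS / elapsed:,.0f} reads/s")

# Sequential: consecutive blocks from the start of the file.
timed("sequential", [i * BLOCK for i in range(READS)])

# Random: blocks scattered across the file; the access pattern that crippled
# HDDs (every read pays a seek) and that SSDs made cheap.
timed("random", [random.randrange(0, size - BLOCK) for _ in range(READS)])
```

On a spinning disk, the random case collapses to a few hundred reads per second because every read pays a seek; on an SSD, the two numbers are far closer, which is exactly why desktops felt transformed overnight when SSDs arrived.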
Unfortunately… we could only buy those benefits once: there is no other technology to switch to that provides such a transformative experience. And thus, once the benefits brought by the new technology were eaten away by careless software, we are almost back to square one. Yes, SSDs are getting faster, but newer drives won't bring the kind of massive differences that the change from HDDs to SSDs brought.

You can see this yourself if you try using recent versions of Windows or macOS without an SSD: it is nigh impossible. These systems now assume that computers have SSDs in them, which is a fair assumption, but a problematic one due to what I mentioned above.

The same applies to "bloat" in apps: open up your favorite resource monitor, look for the disk I/O bandwidth graph, and launch any modern app. You'll see a stream of MBs upon MBs being loaded from disk into memory, all of which must complete before the app is responsive. This is the kind of bloat that Electron adds and that SSDs permitted, but that could be avoided altogether with different design decisions.

Which makes me worried about Apple Silicon. Remember all the rage with the M1 launch and how these new machines had superb performance, extremely long battery life, and no fan noise? Well, wait and see: these benefits will be eaten away if we continue on the same careless path. And once that has happened, it'll be too late. Retrofitting performance into existing applications is very difficult technically, and almost impossible to prioritize organizationally.

So… will computer architects be able to save us with other revolutionary technology shifts? I wouldn't want to rely on that. Not because the shifts might not exist, but because we shouldn't need them.

---

1. Oh wait: remote work does not qualify. I'm sorry: if you did any kind of open source development in the 90s or 2000s, you know that fully-distributed and truly-async work was perfectly possible back then.

2. For a more detailed analysis, Dan Luu already covered this type of slowdown introduced by latency in his famous article "Computer latency: 1977-2017". Note how the article goes further back than 1999 and that the computer with the best latency he found is from 1983. But, yeah, that old computer cannot match the workloads we put our computers through these days, so I don't think comparing it to a modern desktop would be fair.