_______               __                   _______
       |   |   |.---.-..----.|  |--..-----..----. |    |  |.-----..--.--.--..-----.
       |       ||  _  ||  __||    < |  -__||   _| |       ||  -__||  |  |  ||__ --|
       |___|___||___._||____||__|__||_____||__|   |__|____||_____||________||_____|
                                                              on Gopher (unofficial)
 (HTM) Visit Hacker News on the Web
       
       
       COMMENT PAGE FOR:
 (HTM)   Helldivers 2 on-disk size 85% reduction
       
       
        tlonny wrote 9 hours 12 min ago:
         If this is somewhat common for games, could one create a virtual fs
         with FUSE that dedupes via content-defined chunking and install
         games there?
         
         I feel like writes would probably be quite painful, but game assets
         are essentially write-once, read-forever, so not the end of the
         world?
         
         As an aside, it's messed up that people with expensive SSDs are
         unnecessarily paying this storage tax. Just feels lazy...
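         A minimal sketch of the idea, hedged heavily: split files with a
         rolling-hash chunker and store each unique chunk once, which is the
         first pass a FUSE-backed dedupe layer could run. Everything below
         (mask, window, chunk sizes) is illustrative, not taken from the
         article.

```python
import hashlib

def cdc_chunks(data: bytes, mask: int = 0x3FF, window: int = 16):
    """Content-defined chunking with a naive shift-xor rolling hash.

    A boundary is declared wherever the low bits of the hash are zero
    (average chunk ~1 KiB for mask 0x3FF); identical regions in different
    files therefore tend to produce identical chunks.
    """
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) ^ b) & 0xFFFFFFFF  # old bytes shift out after ~32 steps
        if i - start + 1 >= window and (h & mask) == 0:
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])
    return chunks

def dedup_store(files: dict):
    """Store each unique chunk once, keyed by SHA-256, with per-file manifests."""
    store, manifests = {}, {}
    for name, data in files.items():
        manifest = []
        for chunk in cdc_chunks(data):
            key = hashlib.sha256(chunk).hexdigest()
            store.setdefault(key, chunk)  # duplicate chunks stored only once
            manifest.append(key)
        manifests[name] = manifest
    return store, manifests
```

         Writes would indeed be the painful part: a write that shifts content
         re-chunks everything downstream, which is why this fits write-once
         game assets better than general-purpose files.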
       
        forrestthewoods wrote 9 hours 22 min ago:
        Moral of the Story: don’t roll out a fix like this all at once. Do it
        over 6 months over several patches. Keep finding “new
        improvements”.
        
        Just don’t get caught at the end!
       
        roflchoppa wrote 10 hours 45 min ago:
        Do HDDs make up that much of the gaming market segment?
       
          aidenn0 wrote 48 min ago:
          1/9th of Helldivers 2 players, per TFA.
       
        sergiotapia wrote 10 hours 51 min ago:
        If this article was exciting for you, I also highly recommend this one.
        A random dude fixed a bug in GTA 5 that was the root cause of it
        loading insanely slowly since the game came out!
        
 (HTM)  [1]: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times-b...
       
          oceansky wrote 6 hours 38 min ago:
          The write up of how Windows 11 24H2 broke GTA San Andreas was
          excellent.
          
 (HTM)    [1]: https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...
       
        doener wrote 12 hours 55 min ago:
        [dupe]
        
 (HTM)  [1]: https://news.ycombinator.com/item?id=46134178
       
        ok_coo wrote 12 hours 57 min ago:
        I'm glad they've been able to do this, looks like a huge improvement
        for HD2 on PC.
        
        I've been on PS5 since launch and aside from Baldur's Gate 3, it's been
        the best game this gen IMO.
        
         The negativity I see towards the game (especially on YouTube) is
         weird. Some of the critiques seem legit but a lot of it feels like
         rage bait, which seems true of a lot of YT videos around gaming
         lately.
        
        Anyway, a big improvement for a great game. Seems like less of an
        incentive now to uninstall if you only play now and then.
        
       
        mort96 wrote 14 hours 1 min ago:
        The negativity towards this is wild. A company followed relatively
        widely accepted industry practice (lots and lots of other games also
        have huge sizes on disk for the exact same reason), then eventually
        they decided to do their own independent testing to check whether said
        common practice actually makes things better or not in their case,
        found that it didn't, so they reversed it. In addition, they wrote up
        some nice technical articles on the topic, helping to change the old
        accepted industry wisdom.
        
        This seems great to me. Am I crazy? This feels like it should be Hacker
        News's bread and butter, articles about "we moved away from
        Kubernetes/microservices/node.js/serverless/React because we did our
        own investigation and found that the upsides aren't worth the
        downsides" tend to do really well here. How is this received so
        differently?
       
          nearbuy wrote 7 hours 42 min ago:
          This is a mischaracterization of the optimization. This isn't a
          standard optimization that games apply everywhere. It's an
          optimization for spinning disks that some games apply sometimes.
          They're expected to measure if the benefits are worth the cost. (To
          be clear, bundling assets is standard. Duplicating at this level is
          not.)
          
          This doesn't advance accepted industry wisdom because:
          
          1. The trade-off is very particular to the individual game. 
          Their loading was CPU-bound rather than IO-bound so the optimization
          didn't make much difference for HDDs. This is already industry
          wisdom. The amount of duplication was also very high in their game.
          
          2. This optimization was already on its way out as SSDs take over and
          none of the current gen consoles use HDDs.
          
          I'm not mad at Arrowhead or trying to paint them negatively. Every
          game has many bugs and mishaps like this. I appreciate the write-up.
       
          vict7 wrote 9 hours 11 min ago:
          Many players perceive Arrowhead as a pretty incompetent and
          untrustworthy developer. Helldivers has suffered numerous issues with
          both performance and balancing. The bugs constantly introduced into
          the game (not the fun kind you get to shoot with a gun) have eroded a
          lot of trust and good will towards the company and point towards a
          largely non-existent QA process.
          
          I won’t state my own personal views here, but for those that share
          the above perspective, there is little benefit of the doubt they’ll
          extend towards Arrowhead.
       
          Krasnol wrote 9 hours 34 min ago:
          I feel like negativity has become Hacker News's bread and butter.
       
          reactordev wrote 11 hours 57 min ago:
           The negativity comes from the zero effort they put into this prior
           to launch, forcing people to download gigs of data that were
           unnecessary.
           
           Game studios no longer care how big their games are if Steam will
           still take them. This is a huge problem. GTA5 was notorious for
           loading JSON again, and again, and again during loading and it was
           just a mess. Same for HD2: game engines have the ability to only
           pack what is used, but it's still up to the developers to make sure
           their assets are reusable so as to cut down on size.
          
           This is why Star Citizen has been in development for 15 years. They
           couldn't optimize early and were building models and assets as if
           for film. Not low poly game assets but super high poly film assets.
          
          The anger here is real. The anger here is justified. I'm sick of
          having to download 100gb+ simply because a studio is too lazy and
          just packed up everything they made into a bundle.
       
            fyrabanks wrote 11 hours 13 min ago:
            There were 20 people working on this game when they started
            development. Total. I think they expanded to a little over 100.
            This isn't some huge game studio that has time to do optimization.
            
            GTA5 had well over 1000 people on its team.
       
              onli wrote 10 hours 19 min ago:
               Not sure GTA 5 is the right example to list here. Remember [1].
               At least for a while they didn't optimize at all.
              
 (HTM)        [1]: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-t...
       
              reactordev wrote 11 hours 0 min ago:
               Size of team has no bearing on this argument. Saying they were
               small so they get a pass at preventing obscene download sizes is
               like saying “Napster was created by one man, surely he
               shouldn’t be accountable”, but he was.
               
               When making a game, once you have something playable, the next
               step is to figure out how to package it. This is included in
               that effort: determining which assets to compress, package, and
               ship. Sometimes this is done by the engine. Sometimes this is
               done by the art director.
       
                WheatMillington wrote 10 hours 49 min ago:
                Amount of resources absolutely has a bearing on how resources
                can be allocated.
       
                  reactordev wrote 8 hours 13 min ago:
                  This isn’t a resourcing issue. It’s a lack of knowledge
                  and skipped a step issue.
                  
                  When I did this. My small team took a whole sprint to make
                  sure that assets were packed. That tilemaps were made. That
                  audio files were present and we did an audit to make sure
                  nothing extra was packaged on disk. Today, because of digital
                  stores and just releasing zip files, no one cares what they
                  ship and often you can see it if you investigate the files of
                  any Unity or Unreal engine game. Just throw it all over the
                  fence.
       
            bluedino wrote 11 hours 44 min ago:
            > They couldn't optimize early and were building models and assets
            like it's for film. Not low poly game assets but super high poly
            film assets.
            
            Reminds me of the Crack.com interview with Jonathan Clark:
            
            Adding to the difficulty of the task, our artist had no experience
            in the field. I remember in a particular level we wanted to have a
            dungeon. A certain artist begin by creating a single brick, then
            duplicating it several thousand times and building a wall out of
            the bricks. He kept complaining that his machine was too slow when
            he tried to render it. Needless to say this is not the best way to
            model a brick wall.
            
 (HTM)      [1]: https://web.archive.org/web/20160125143707/http://www.loon...
       
              reactordev wrote 11 hours 31 min ago:
               This is very, very common, as there are only a handful of
               schools that teach this. Displacement mapping with a single
               poly is the answer. Game-dev-focused schools teach this, but at
               any other visual media school it's "build a brick, array the
               brick 10,000 times".
       
          somat wrote 12 hours 13 min ago:
           At one point (I think it was Titanfall 2) the PC port of a game
           deliberately converted its audio to uncompressed wav files in order
           to inflate the install size. They said it was for performance, but
           the theory was that it was to make it more inconvenient for pirates
           to distribute.
           
           When the details of exactly why the game was so large came out,
           many people felt this was a sort of customer betrayal: the
           publisher was burning a large part of the volume of your precious
           high speed SSD for a feature that added nothing to the game.
          
          People probably feel the same about this, why were they so
          disrespectful of our space and bandwidth in the first place? But I
          agree it is very nice that they wrote up the details in this
          instance.
       
            ycombinatrix wrote 9 hours 6 min ago:
            Wasn't that Titanfall 1? I remember Titanfall 2 having a much
            smaller installation size.
       
            snet0 wrote 9 hours 15 min ago:
            This is conspiratorial nonsense.
       
            maccard wrote 9 hours 23 min ago:
            > They said it was for performance but the theory was to make it
            more inconvenient for pirates to distribute.
            
            This doesn't even pass the sniff test. The files would just be
            compressed for distribution and decompressed on download. Pirated
            games are well known for having "custom" installers.
       
              ycombinatrix wrote 8 hours 48 min ago:
              >The files would just be compressed for distribution and
              decompressed on download
              
              All Steam downloads are automatically compressed. It's also
              irrelevant. The point is that playback of uncompressed audio is
              indeed cheaper than playback of compressed audio.
       
                justsomehnguy wrote 4 hours 52 min ago:
                > The point is that playback of uncompressed audio
                
                 Bullshit. This hasn't been a problem since 2003.
                 
                 And nobody forbids you from actually decompressing your
                 compressed audio when you are loading the assets from the
                 disk.
       
                duskwuff wrote 8 hours 38 min ago:
                > The point is that playback of uncompressed audio is indeed
                cheaper than playback of compressed audio.
                
                Even when Titanfall 2 was released in 2016, I don't think that
                was meaningfully the case. Audio compression formats have been
                tuned heavily for efficient playback.
       
                  ycombinatrix wrote 6 hours 5 min ago:
                  I think GP was confused - Titanfall 1 from 2014 is the one
                  with the massive volume of uncompressed audio. Though I think
                  your point still stands.
                  
                  I was trying to point out that the decision to compress or
                  not compress audio likely has nothing to do with the download
                  size.
       
                  nearbuy wrote 7 hours 27 min ago:
                   Uncompressed audio is typically used for sound effects,
                   while music is compressed. Latency is the primary benefit:
                   uncompressed audio will play immediately, while an mp3 will
                   have a few frames' delay. Sounds like gunshots or footsteps
                   are typically short files anyway, so the increased memory
                   usage isn't that painful.
                  
                  Games also can stack many sounds, so even if the decoding
                  cost is negligible when playing a single sound, it'll be
                  greater if you have 32 sounds playing at once.
       
                    duskwuff wrote 6 hours 42 min ago:
                    > Uncompressed audio will play immediately while an mp3
                    will have a few frames delay.
                    
                    I'm not sure what you mean by this. Encoding latency is
                    only relevant when you're dealing with live audio streams -
                    there's no delay inherent to playing back a recorded sound.
                    
                    > Sounds like gunshots or footsteps are typically short
                    files anyway, so the increased memory usage isn't that
                    painful.
                    
                    Not all sound effects are short (consider e.g. loops for
                    ambient noise!), and the aggregate file size for
                    uncompressed audio can be substantial across an entire
                    game.
       
                      nearbuy wrote 6 hours 8 min ago:
                      > there's no delay inherent to playing back a recorded
                      sound.
                      
                      There absolutely is. You can decompress compressed audio
                      files when loading so they play immediately, but if you
                      want to keep your mp3 compressed, you get a delay. Games
                      keep the sound effects in memory uncompressed.
                      
                      > Not all sound effects are short
                      
                      Long ambient background noises often aren't latency
                      sensitive and can be streamed. For most games textures
                      are the biggest usage of space and audio isn't that
                      significant, but every game is different. I'm just
                      telling you why we use uncompressed audio. If there is a
                      particular game you know of that's wasting a lot of space
                      on large audio files, you should notify the devs.
                      
                      There is a reason both Unity and Unreal use uncompressed
                      audio or ADPCM for sound effects.
       
                        justsomehnguy wrote 4 hours 50 min ago:
                        > but if you want to keep your mp3 compressed, you get
                        a delay
                        
                        If that really bothers you then write your own on-disk
                        compression format.
                        
                        > why we use uncompressed audio
                        
                        > ADPCM
                        
                        ... which is a compressed and lossy format.
       
                          danbolt wrote 39 min ago:
                          Within the scope of a game’s production, the
                          programmer time spent dogfooding the new audio format
                          can be used towards something else that improves the
                          value of the end product.
                          
                          The uncompressed audio for latency-sensitive
                          one-shots usually isn’t taking up the bulk of
                          memory either.
       
                          nearbuy wrote 3 hours 41 min ago:
                          > If that really bothers you then write your own
                          on-disk compression format.
                          
                          Why? What are you trying to solve here? You're going
                          to have a hard time making a new format that serves
                          you better than any of the existing formats.
                          
                          The most common solution for instant playback is just
                          to store the sound uncompressed in memory. It's not a
                          problem that needs solving for most games.
                          
                          ADPCM and PCM are both pretty common. ADPCM for audio
                          is kinda like DXT compression for textures: a very
                          simple compression that produces files many times
                          larger than mp3, and doesn't have good sound quality,
                          but has the advantage that playback and seek costs
                          virtually nothing over regular PCM. The file sizes of
                          ADPCM are closer to PCM than mp3. I should have been
                          clearer in my first comment that the delay is only
                          for mp3/Vorbis and not for PCM/ADPCM.
                          
                          There isn't a clean distinction between compressed
                          and uncompressed and lossy/lossless in an absolute
                          sense. Compression is implicitly (or explicitly)
                          against some arbitrary choice of baseline. We
                          normally call 16-bit PCM uncompressed and lossless
                          but if your baseline is 32-bit floats, then it's
                          lossy and compressed from that baseline.
       
                            justsomehnguy wrote 2 hours 22 min ago:
                            > Why? What are you trying to solve here? You're
                            going to have a hard time making a new format that
                            serves you better than any of the existing formats.
                            
                             Storage space. But this is the way for the same
                             guys who duplicate 20 GB seven times "to serve
                             better by the industry standard".
                             
                             More sane people would just pack that AD/PCM in a
                             .pk3^W sorry in a .zip file (or any other
                             packaging format with an LZ/7z/whatever
                             compatible compression method) with the fastest
                             profile and would have the best of both worlds:
                             sane storage requirements, uncompressed in
                             memory. As a bonus it would load faster from an
                             HDD, because a data chunk that is 10 times
                             smaller than the uncompressed one loads,
                             surprise, 10 times faster.
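                             That packing scheme is easy to sketch with the
                             standard library. The synthetic "gunshot" below
                             stands in for real PCM; the file name and
                             waveform are made up, and the repetitive wave
                             overstates the win a bit.

```python
import io
import wave
import zipfile

# Synthesize ~1 second of 16-bit mono "PCM": a repetitive square-ish wave.
# Like much real game audio it carries redundancy for DEFLATE to find.
frames = (b"\x00\x40" * 100 + b"\x00\xc0" * 100) * 220  # 44,000 samples

buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)      # bytes per sample
    w.setframerate(44100)
    w.writeframes(frames)
raw = buf.getvalue()

packed = io.BytesIO()
# compresslevel=1 is the "fastest profile": cheap to decompress at load
# time, after which the sound sits in memory as plain PCM for playback.
with zipfile.ZipFile(packed, "w", zipfile.ZIP_DEFLATED, compresslevel=1) as z:
    z.writestr("gunshot.wav", raw)

print(len(raw), len(packed.getvalue()))
```

                             On disk you pay the deflated size, in memory you
                             pay plain PCM; nothing latency-sensitive ever
                             decodes at play time.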
       
            recursive wrote 10 hours 0 min ago:
            I remember seeing warez game releases in the late 90s that had
            custom packaging to de-compress sound effects that were stored
            uncompressed in the original installer.
            
            It seems no one takes pride in their piracy anymore.
       
            ryandrake wrote 12 hours 4 min ago:
             > When the details of exactly why the game was so large came out,
             many people felt this was a sort of customer betrayal: the
             publisher was burning a large part of the volume of your precious
             high speed SSD for a feature that added nothing to the game.
            
            Software developers of all kinds (not just game publishers) have a
            long and rich history of treating their users' compute resources as
            expendable. "Oh, users can just get more memory, it's cheap!" "Oh,
            xxxGB is such a small hard drive these days, users can get a bigger
            one!" "Oh, most users have Pentiums by now, we can drop 486
            support!" Over and over we've seen companies choose to throw their
            users under the bus so that they can cheap out on optimizing their
            product.
       
              mghackerlady wrote 9 hours 22 min ago:
               Maybe that'll start to change now that RAM is the new gold and
               who knows what the AI bubble will eat next.
       
          zamadatix wrote 12 hours 23 min ago:
          Arrowhead probably deserves more love for breaking the norm but I
          think it's overshadowed by people finding out for the first time the
          reason HDDs are so common in gaming setups is companies have been
          blindly shaving a few seconds off HDD load time off at the cost of 7x
          the disk space.
          
          If it had been more well known this was the cause of game bloat
          before then this probably would have been better received. Still,
          Arrowhead deserves more credit both for testing and breaking the norm
          as well as making it a popular topic.
       
            nopurpose wrote 10 hours 13 min ago:
             My immediate question: if all of that was on-disk data
             duplication, why did it affect the download size? Can't a small
             download be expanded into the optimal layout on the client side?
       
              ender341341 wrote 7 hours 17 min ago:
              depending on how the data duplication is actually done (like
              texture atlasing the actual bits can be very different after
              image compression) it can be much harder to do rote bit level
              deduplication. They could potentially ship the code to generate
              all of those locally, but then they have to deal with a lot of
              extra rights/contracts to do so (proprietary codecs/tooling is
              super, super common in gamedev), and
              
              Also largely cause devs/publishers honestly just don't really
              think about it, they've been doing it as long as optical media
              has been prevalent (early/mid 90s) and for the last few years
              devs have actually been taking a look and realizing it doesn't
              make as much sense as it used to, especially if like in this case
              the majority of the time is spent on runtime generation of, or if
              they require a 2080 as minimum specs whats the point of
              optimizing for 1 low end component if most people running it are
              on high end systems.
              
              Hitman recently (4 years ago) did a similar massive file shrink
              and mentioned many of the same things.
       
              braiamp wrote 9 hours 16 min ago:
              It didn't. They downloaded 43 GB instead of 152 GB, according to
              SteamDB: [1] Now it is 20 GB => 21 GB. Steam is pretty good at
              deduplicating data in transit from their servers. They are not
              idiots that will let developers/publishers eat their downstream
              connection with duplicated data.
              
 (HTM)        [1]: https://steamdb.info/app/553850/depots/
 (HTM)        [2]: https://partner.steamgames.com/doc/sdk/uploading#AppStru...
       
                myself248 wrote 6 hours 47 min ago:
                Furthermore, this raises the possibility of a "de-debloater"
                that HDD users could run, which would duplicate the data into
                its loading-optimized form, if they decided they wanted to
                spend the space on it. (And a "de-de-debloater" to recover the
                space when they're not actively playing the game...)
                
                The whole industry could benefit from this.
       
                  nomel wrote 2 hours 52 min ago:
                  > to recover the space when they're not actively playing the
                  game
                  
                   This would defeat the purpose. The goal of the duplication
                   is to place related data physically close on the disk. Hard
                   links, removing then replacing, etc. wouldn't preserve the
                   physical spacing of the data, meaning the terribly slow
                   read head has to physically sweep around more.
                   
                   I think the sane approach would be to have an HDD/SSD
                   switch for the file lookups, with all the references
                   pointing to the same file for SSD.
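                   That switch could be as small as a path-resolution
                   function. All bundle names and asset paths below are
                   hypothetical, just to show the shape of the idea:

```python
# Hypothetical asset index: on HDD, each mission bundle carries its own copy
# of a shared asset so reads stay sequential; on SSD every reference can
# point at one canonical file. Names and paths here are made up.
SHARED_ASSETS = ["textures/dirt.dds", "audio/wind.wav"]
BUNDLES = ["mission_a", "mission_b", "mission_c"]

def asset_path(bundle: str, asset: str, on_hdd: bool) -> str:
    if on_hdd:
        # duplicated copy stored inside the bundle for physical locality
        return f"{bundle}/{asset}"
    # single deduplicated copy shared by all bundles
    return f"shared/{asset}"

hdd_paths = {asset_path(b, a, True) for b in BUNDLES for a in SHARED_ASSETS}
ssd_paths = {asset_path(b, a, False) for b in BUNDLES for a in SHARED_ASSETS}
print(len(hdd_paths), len(ssd_paths))  # 6 distinct files vs 2
```

                   The duplicated layout costs 3x the files here; the lookup
                   table is what lets one shipped build serve both media.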
       
              ahartmetz wrote 9 hours 56 min ago:
              Sure it can - it would need either special pre- and
              postprocessing or lrzip ("long range zip") to do it
              automatically. lrzip should be better known, it often finds
              significant redundancy in huge archives like VM images.
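               A quick way to see why the long range matters, using fixed
               64 KiB chunks as a crude stand-in for lrzip's sliding search
               (the chunk size and layout are arbitrary): DEFLATE's 32 KiB
               window can't see a duplicate placed 1 MiB away, while a
               chunk-hash pass removes it trivially.

```python
import hashlib
import os
import zlib

# Two identical 1 MiB blobs separated by 1 MiB of incompressible filler:
# redundancy DEFLATE's 32 KiB window cannot reach, but a long-range
# first pass (what lrzip does before compressing) removes easily.
blob = os.urandom(1 << 20)
filler = os.urandom(1 << 20)
archive = blob + filler + blob

deflated = zlib.compress(archive, 6)  # barely shrinks random data at all

# First-pass dedup: fixed 64 KiB chunks, each unique chunk stored once.
chunk, seen, unique = 1 << 16, set(), 0
for off in range(0, len(archive), chunk):
    digest = hashlib.sha256(archive[off:off + chunk]).digest()
    if digest not in seen:
        seen.add(digest)
        unique += chunk
print(len(archive), len(deflated), unique)
```

               The second copy of the blob vanishes from the dedup total,
               while plain DEFLATE stores all three megabytes.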
       
            abtinf wrote 12 hours 0 min ago:
            Part of what makes this outrageous is that the install size itself
            is probably a significant part of the reason to install the game on
            an HDD.
            
            154GB vs 23GB can trivially make the difference of whether the game
            can be installed on a nice NVMe drive.
            
            Is there a name for the solution to a problem (make size big to
            help when installed on HDD) in fact being the cause of the problem
            (game installed on HDD because big) in the first place?
       
              jayd16 wrote 9 hours 3 min ago:
              "Self fulfilling prophecy" perhaps?
       
              consp wrote 9 hours 26 min ago:
               Can any games these days reliably run on HDDs with max
               200 MB/s throughput (at best)? Or does everyone get a coffee
               and some cookies when a new zone loads? Even with this
               reduction that will take a while.
               
               I thought all required SSDs now for "normal" gameplay.
       
                kbolino wrote 9 hours 4 min ago:
                Until you get to super-high-res textures and the like, the
                throughput isn't nearly as important as the latency.
                
                At 200 MB/s the way hard drives usually measure it, you're able
                to read up to 390,625 512-byte blocks in 1 second, or to put it
                another way, a block that's immediately available under the
                head can be read in 2.56 microseconds. On the other hand, at
                7200 RPM, it takes up to 8.33 milliseconds to wait for the
                platter to spin around and reach a random block on the same
                track. Even if these were the only constraints, sequentially
                arranging data you know you'll need to have available at the
                same time cuts latency by a factor of about 3000.
                
                It's much harder to find precise information about the speed of
                the head arm, but it also usually takes several milliseconds to
                move from the innermost track to the outermost track or vice
                versa. In the worst case, this would double the random seek
                time, since the platter has to spin around again because the
                head wasn't in position yet. Also, since hard drives are so
                large nowadays, the file system allocators actually tend to
                avoid fragmentation upfront, leading to generally having few
                fragments for large files (YMMV).
                
                So, the latency on a hard drive can be tolerable when optimized
                for.
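                 Those figures check out with a few lines of arithmetic
                 (decimal megabytes, the way drives advertise):

```python
# Quick check of the latency figures above.
throughput = 200 * 10**6            # 200 MB/s
block = 512                         # bytes per block
blocks_per_s = throughput / block
transfer_us = block / throughput * 1e6

rpm = 7200
full_rotation_ms = 60 / rpm * 1000  # worst-case rotational wait

print(f"{blocks_per_s:,.0f} blocks/s, {transfer_us:.2f} us per block")
print(f"{full_rotation_ms:.2f} ms max rotational latency")
print(f"ratio ~{full_rotation_ms * 1000 / transfer_us:,.0f}x")
```

                 The ratio comes out near 3,255, matching the "factor of
                 about 3000" figure.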
       
                  wtallis wrote 8 hours 46 min ago:
                  > On the other hand, at 7200 RPM, it takes up to 138
                  microseconds to wait for the platter to spin around and reach
                  a random block on the same track.
                  
                  You did the math for 7200 rotations per second, not 7200
                  rotations per minute = 120 rotations per second.
                  
                  In gaming terms, you get at most one or two disk reads per
                  frame, which effectively means everything has to be carefully
                  prefetched well in advance of being needed. Whereas on a
                  decade-old SATA SSD you get at least dozens of random reads
                  per frame.
       
                    kbolino wrote 8 hours 43 min ago:
                    Fixed!
       
          eurekin wrote 12 hours 49 min ago:
           The negativity wasn't created in a vacuum. Arrowhead has a long
           track record of technical mishaps and a proven history of erasing
           all evidence of those issues without ever acknowledging them. The
           Reddit, Discord, and YouTube comment sections are heavily
           moderated. I suspect there might be a third party involved which
           doesn't forward any technical issue if the complaint contains any
           sign of frustration. Even the relationship with their so-called
           "Propaganda Commanders" (the official moniker for their YouTube
           partner channels) has been significantly strained in two cases,
           over trivialities.
          
          It took Sony's intervention to actually pull back the game into
          playable state once - resulting in the so called 60 day patch.
          
           Somehow random modders were able to fix some of the most
           egregiously ignored issues (like an enemy type making no sound)
           quickly and effectively. Arrowhead ignored it, then denied it, then
           used the "gamers bad" tactic and banned people for pointing it out.
           After a long time they finally fixed it, and tried to bury it in
           the patch notes too.
          
           They also have been caught straight up lying about changes;
           the most recent one was "Apparently we didn't touch the
           Coyote", where they simply buffed enemies' resistance to
           fire, effectively nerfing the gun.
       
            sigmoid10 wrote 12 hours 40 min ago:
            Sony nearly killed all good will the game had accrued when they
            tried to use the massive player base as an opportunity to force
            people into their worthless ecosystem. I don't think Sony even has
            the capability to make good technical decisions here, they are just
            the publisher. It was always Arrowhead trying to keep up with their
            massive success that they clearly weren't prepared for at all. In
            the beginning they simply listened to some very vocal players'
            complaints, which turned out to not be what the majority actually
            wanted. Player driven development is hardly ever good for a game.
       
              eurekin wrote 12 hours 22 min ago:
              So, players wanting:
              
               - Their PC not to reboot and BSOD (was the case a few
               months ago)
              
              - Be able to actually finish a mission (game still crashes a lot
              just after extraction, it's still rare for the full team to
              survive 3 missions in a row)
              
               - Be able to use weapon customisation (the game crashed
               when you navigated to the page with custom paints)
              
              - Continue to run, even when anybody else from the team was
              stimming (yes, any person in the team stimming caused others to
              slow down)
              
              - Actually be able to hear one of the biggest enemies in the game
              
               - To not issue stim/reload/weapon change multiple times
               just for them to work (it's still normal to press stim 6
               times in some cases before it activates, without any
               real reason)
              
              - Be able to use chat, when in the vehicle (this would result in
              using your primary weapon)
              
              - Be able to finish drill type mission (this bugs out a lot
              still)
              
               - Not be attacked by enemies that phase through
               buildings
              
              - Not be attacked by bullets passing through terrain, despite the
              player bullets being stopped there
              
               are just vocal players' complaints? A lot of those bugs
               went totally unaddressed for months. Some keep coming
               back in regressions. Some are still ongoing. This is
               only a short list of things I came across while casually
               playing. It's a rare sight to have a full OP without an
               issue (even mission hardlocks still happen).
              
               About Sony - I was specifically referring to Shams
               Jorjani's (CEO of ArrowHead) explanation to Hermen Hulst
               (the head of PlayStation Studios) of why the review
               score collapsed to 19%, among other issues.
       
                FieryMechanic wrote 11 hours 17 min ago:
                As someone with 700 hours in the game, I've played the game
                both on Windows and Linux.
                
                A lot of issues are to do with the fact that the game seems to
                corrupt itself. If I have issues (usually performance related),
                I do a steam integrity check and I have zero issues afterwards.
                 BTW, I've had to do this on several games now, so this
                 isn't something that is unique to HellDivers. My
                 hardware is good BTW; I check it in various utils and
                 the drives are "ok" as far as I can tell.
                
                > - To their PC not reboot and BSOD (was a case few months ago)
                
                 This was hyped up by a few big YouTubers. The BSODs
                 happened because their PCs were broken. One literally
                 had a burn mark on their processor (a known issue with
                 some board/processor combos) and the BSODs went away
                 when they replaced their processor. This tells me that
                 there was something wrong with their PC and any game
                 would have caused a BSOD.
                
                 So I am extremely sceptical of any claims of BSODs
                 because of a game. What is almost always the case is
                 that the OS or the hardware is at issue and playing a
                 game will trigger the issue.
                
                 If you are experiencing BSODs I would make sure your
                 hardware and OS are actually good, because they are
                 probably not. BTW I haven't had a BSOD in Windows for
                 about a decade because I don't buy crap hardware.
                
                > - Be able to actually finish a mission (game still crashes a
                lot just after extraction, it's still rare for the full team to
                survive 3 missions in a row)
                
                False. A few months ago I played it for an entire day and the
                game was fine. Last week I played it a good portion of Saturday
                night. I'm in several large HellDivers focused Discord servers
                and I've not heard a lot of people complaining about it. Maybe
                6 months ago or a year ago this was the case, but not now.
                
                > Be able to use weapon customisation (the game crashed, when
                you navigated to the page with custom paints)
                
                This happened for like about a week for some people and I
                personally didn't experience this.
                
                > To not issue stim/reload/weapon change multiple times, for
                them just to work (it's still normal to press stim 6 times in
                some cases, before it activates, without any real reason)
                
                 I've not experienced this, nor heard anyone complain
                 about it, and I am in like 4 different
                 HellDivers-focused Discord servers.
                
                > Not be attacked by enemies that faze through buildings
                
                This can be annoying, but it happens like once in a while. It
                isn't the end of the world.
       
                  XorNot wrote 8 hours 41 min ago:
                  It's basically an Internet fable at this point that there's
                  "a game that physically damages your hardware".
                  
                  The answer to every such claim is just: no. But it's click
                  bait gold to the brain damage outrage YouTuber brigade.
                  
                   Accidentally using a ton of resources might reveal
                   weaknesses, but it is absolutely not any software
                   vendor's problem that 100% load might reveal your
                   thermal paste application sucked or that Nvidia is
                   skimping on cable load balancing.
       
                    eurekin wrote 8 hours 20 min ago:
                    Trust me, I'm a software developer with more than two
                    decades of experience. Have been dabbling in hardware since
                    the Amiga 500 era. "I have that specific set of skills"
                    that allows me to narrow down a class of issues pretty well
                    - just a lot of component switching in a binary divide and
                    conquer fashion across hardware.
                    
                     The issue is 1) actually exaggerated in the
                     community, but not without actual substance 2)
                     getting disregarded exactly because of the
                     exaggerations. It was a very real thing.
                    
                    I also happen to have a multi gpu workstation that works
                    flawlessly too
       
                    FieryMechanic wrote 8 hours 24 min ago:
                    This was pretty much my take as well. I have an older CPU,
                    Motherboard and GPU combo before the newer GPU power cables
                    that obviously weren't tested properly and I have no
                    problems with stability.
                    
                    These guys are running an intensive game on the highest
                    difficulty, while streaming and they probably have a bunch
                    of browser windows and other software running background.
                    Any weakness in the system is going to be revealed.
                    
                     I had performance issues during that time and I
                     had to restart the game every 5 matches. But it
                     takes like a minute to restart the game.
       
                  eurekin wrote 10 hours 17 min ago:
                  > > - Be able to actually finish a mission (game still
                  crashes a lot just after extraction, it's still rare for the
                  full team to survive 3 missions in a row)
                  
                  > False. A few months ago I played it for an entire day and
                  the game was fine. Last week I played it a good portion of
                  Saturday night. I'm in several large HellDivers focused
                  Discord servers and I've not heard a lot of people
                  complaining about it. Maybe 6 months ago or a year ago this
                  was the case, but not now.
                  
                  I specifically mean the exact time, right after the pelican
                  starts to fly. I keep seeing " left" or "disconnected". Some
                  come back and I have a habit of asking: "Crash?", they
                  respond with "yeah"
       
                    FieryMechanic wrote 8 hours 28 min ago:
                    If that is happening, they need to do a Steam Integrity
                    check. I understand the game is buggy, but it isn't that
                    buggy.
       
                  eurekin wrote 10 hours 53 min ago:
                  > - To their PC not reboot and BSOD (was a case few months
                  ago)
                  
                   I was just about to replace my GPU (a 4090 at
                   that!); I had them 3 times a session. I did sink a
                   lot of hours into debugging that (replaced cables,
                   switched PSUs between desktops) and just gave up.
                   After a few weeks, lo and behold, a patch came out
                   and it all disappeared.
                  
                  A lot of people just repeat hearsay about the game
       
                  gfaster wrote 10 hours 59 min ago:
                  > So I am extremely sceptical of any claims of BSODs because
                  of a game.
                  
                  Generally speaking, I am too. That is unless there is
                  kernel-level anticheat. In that case I believe it's fair to
                  disregard all other epistemological processes and blame BSODs
                  on the game out of principle
       
                    FieryMechanic wrote 8 hours 30 min ago:
                    > In that case I believe it's fair to disregard all other
                    epistemological processes and blame BSODs on the game out
                    of principle
                    
                    I am sorry but that is asinine and unscientific. You should
                    blame BSODs on what is causing them. I don't like kernel
                    anti-cheat but I will blame the actual cause of the issues,
                    not assign blame on things which I don't approve of.
                    
                     I am a long time Linux user and many of the people
                     complaining about BSODs on Windows had broken the
                     OS in one way or another. Some were running weird
                     stuff like 3rd party shell extensions that modify
                     core DLLs, or they had installed every POS
                     shovelware/shareware crap. That isn't Microsoft's
                     fault if you start running an unsupported
                     configuration of the OS.
                    
                     Similarly, the YouTubers that were most vocal
                     about HellDivers problems did basically no proper
                     investigation other than saying "look it crashed",
                     when it was quite clearly their broken hardware
                     that was the issue. As previously stated, their
                     CPU had a burn mark on one of the pins; some AM5
                     boards had faults that caused this IIRC. So
                     everything indicated hardware failure being the
                     cause of the BSOD. They still blamed the game,
                     probably because it got them more watch time.
                    
                     During the same time period when people were
                     complaining about BSODs, I didn't experience one.
                     I was running the same build of the game as them
                     and playing on the same difficulty and sometimes
                     recording it via OBS (just like they were). What I
                     didn't have was an AM5 motherboard; I have an
                     older AM4 motherboard which doesn't have these
                     problems.
       
                      gfaster wrote 7 hours 36 min ago:
                      > that is asinine and unscientific
                      
                      Well, yes. I did say something to that effect. Blaming
                      BSODs on invasive anti-cheat out of principle is a
                      political position, not a scientific one.
                      
                       > During the same time period when people were
                       complaining about BSODs, I didn't experience
                       one. I was running the same build of the game as
                       them and playing on the same difficulty and
                       sometimes recording it via OBS (just like they
                       were). What I didn't have was an AM5
                       motherboard; I have an older AM4 motherboard
                       which doesn't have these problems.
                      
                      I understand what you're saying here, but anyone who does
                      a substantial amount of systems programming could tell
                      you that hardware-dependent behavior is evidence for a
                      hardware problem, but does not necessarily rule out a
                      software bug that only manifests on certain hardware. For
                      example, newer hardware could expose a data race because
                      one path is much faster. Alternatively, a subroutine
                      implemented with new instructions could be incorrect.
                      
                       Regardless, I don't doubt that this issue with
                       Helldivers 2 was caused by (or at least surfaced
                       by) certain hardware, but that does not change
                       that, given such an issue, I would presume the
                       culprit is kernel anticheat until presented with
                       strong evidence to the contrary.
       
                        FieryMechanic wrote 6 hours 46 min ago:
                        > Well, yes. I did say something to that effect.
                        Blaming BSODs on invasive anti-cheat out of principle
                        is a political position, not a scientific one.
                        
                        When there are actual valid concerns about the
                        anti-cheat, these will be ignored because of people
                        that assigned blame to it when unwarranted. This is why
                        making statements based on your ideology can be
                        problematic.
                        
                        > I understand what you're saying here, but anyone who
                        does a substantial amount of systems programming could
                        tell you that hardware-dependent behavior is evidence
                        for a hardware problem, but does not necessarily rule
                        out a software bug that only manifests on certain
                        hardware. For example, newer hardware could expose a
                        data race because one path is much faster.
                        Alternatively, a subroutine implemented with new
                        instructions could be incorrect.
                        
                         People were claiming it was causing hardware
                         damage, which is extremely unlikely since
                         Intel, AMD and most hardware manufacturers
                         have mechanisms which prevent this. This isn't
                         some sort of opaque race condition.
                        
                         > I would presume the culprit is kernel
                         anti-cheat until presented with strong
                         evidence to the contrary.
                        
                         You should know that if you make assumptions
                         without evidence, that will often lead you
                         astray.
                        
                        I don't like kernel anti-cheat and would prefer for it
                        not to exist, but making stupid statements based on
                        ideology instead of evidence just makes you look silly.
       
                    eurekin wrote 10 hours 50 min ago:
                    I had them and I keep observing this strange tendency to
                    wipe that particular issue out of existence
       
          Night_Thastus wrote 13 hours 4 min ago:
          It would be one thing if it was a 20% increase in space usage, or if
          the whole game was smaller to start with, or if they had actually
          checked to see how much it assisted HDD users.
          
          But over 6x the size with so little benefit for such a small segment
          of the players is very frustrating. Why wasn't this caught earlier?
          Why didn't anyone test? Why didn't anyone weigh the pros and cons?
          
          It's kind of exemplary of HD2's technical state in general - which is
          a mix of poor performance and bugs. There was a period where almost
          every other mission became impossible to complete because it was
          bugged.
          
          The negativity is frustration boiling over from years of a bad
          technical state for the game.
          
          I do appreciate them making the right choice now though, of course.
       
            teamonkey wrote 10 hours 35 min ago:
            It was a choice, not an oversight. They actively optimised for HDD
            users, because they believed that failing to do so could impact
            load times for both SSD and HDD users. There was no speed penalty
            in doing so for SSD users, just a disk usage penalty.
            
            Helldivers II was also much smaller at launch than it is now. It
            was almost certainly a good choice at launch.
       
              mort96 wrote 10 hours 30 min ago:
              You make a million decisions in the beginning of every project.
              I'm certain they made the choice to do this "optimization" at an
              early point (or even incidentally copied the choice over from an
              earlier project) at a stage where the disk footprint was small (a
              game being 7GB when it could've been 1GB doesn't exactly set off
              alarm bells).
              
              Then they just didn't reconsider the choice until, well, now.
       
                teamonkey wrote 9 hours 1 min ago:
                Even at the end of development it’s a sensible choice. It’s
                the default strategy for catering to machines with slow disk
                access. The risk of some players experiencing slow load times
                is catastrophic at launch. In absence of solid user data,
                it’s a fine assumption to make.
       
                  brokenmachine wrote 2 hours 15 min ago:
                   Call me a dinosaur, but I don't consider a 154GB
                   download before I can start playing a good first
                   impression.
                   
                   In fact, I would seriously reconsider buying a game
                   that big if I knew beforehand. When a 500GB SSD is
                   $120 Aussie bucks, that's $37 of storage.
       
                  XorNot wrote 8 hours 46 min ago:
                  The first impression matters is the thing. This was John
                  Carmacks idea on how to sell interlacing to smartphone
                  display makers for VR: the upsell he had was that there's one
                  very important moment when a consumer sees a new phone: they
                  pick it up, open something and flick it and that scroll
                  effect better be a silky smooth 60 FPS or more or there's
                  trouble. (His argument was making that better would be a side
                  effect of what he really wanted).
       
            colechristensen wrote 11 hours 55 min ago:
            >But over 6x the size with so little benefit for such a small
            segment of the players is very frustrating. Why wasn't this caught
            earlier? Why didn't anyone test? Why didn't anyone weigh the pros
            and cons?
            
            Have you never worked in an organization that made software?
            
            Damn near everything can be 10x as fast and using 1/10th the
            resources if someone bothered to take the time to find the
            optimizations.    RARE is it that something is even in the same order
            of magnitude as its optimum implementation.
       
              thaumasiotes wrote 10 hours 9 min ago:
              But this isn't an optimization. The 150+GB size is the
              "optimization", one that never actually helped with anything. The
              whole news here is "Helldivers 2 stopped intentionally screwing
              its customers".
              
              I don't see why it's a surprise that people react "negatively",
              in the sense of being mad that (a) Helldivers 2 was intentionally
              screwing the customers before, and (b) everyone else is still
              doing it.
       
                bigstrat2003 wrote 6 hours 52 min ago:
                > The whole news here is "Helldivers 2 stopped intentionally
                screwing its customers".
                
                That is an extremely disingenuous way to frame the issue.
       
                  thaumasiotes wrote 6 hours 16 min ago:
                  How so?
       
              zamadatix wrote 10 hours 50 min ago:
              I think what makes this a bit different from the usual
              "time/value tradeoff" discussion is bloating the size by 6x-7x
              was the result of unnecessary work in the name of optimization
              instead of lack of cycles to spend on optimization.
       
                mort96 wrote 8 hours 46 min ago:
                Eh probably not, it's probably handled by some automated system
                when making release builds of the game. Sure, implementing that
                initially was probably some work (or maybe it was just checking
                a checkbox in some tool), but there's probably not much manual
                work involved anymore to keep it going.
                
                Reverting it now though, when the game is out there on a
                million systems, requires significant investigation to ensure
                they're not making things significantly worse for anyone, plus
                a lot of testing to make sure it doesn't outright break stuff.
       
                  zamadatix wrote 2 hours 55 min ago:
                   Reverting it now was certainly a pile of work, but
                   that's neither here nor there for the portion of the
                   story bothering people. It's like they threw rocks
                   through the windows years ago to make them slightly
                   clearer to see through, and now put a ton of work in
                   to undo that because they discovered it made no
                   sense in reality.
                  
                  It's great they did all the work to fix it after the fact,
                  but that doesn't justify why it was worth throwing rocks
                  through the window in the first place (which is different
                  than not doing optimizations).
       
              ozgrakkurt wrote 11 hours 31 min ago:
              This is not a reason for accepting it imo
       
                mywittyname wrote 11 hours 20 min ago:
                Optimization takes up time, and often it takes up the time of
                an expert.
                
                Given that, people need to accept higher costs, longer
                development times, or reduced scope if they want better
                optimized games.
                
                 But what is worse is that just trying to optimize
                 software is not the same as successfully optimizing
                 it. So time and money spent on optimization might
                 yield no results, because there might not be any more
                 efficiency to be gained, the person doing the work
                 lacks the technical skill, the gains are part of a
                 tradeoff that cannot be justified, or the person doing
                 the work can't make a change (i.e., a 3rd party
                 library is the problem).
                
                The lack of technical skill is a big one, IMO. I'm personally
                terrible at optimizing code, but I'm pretty good at building
                functional software in a short amount of time.    We have a
                person on our team who is really good at it and sometimes he'll
                come in after me to optimize work that I've done.  But he'll
                spend several multiples of the time I took making it work and
                hammering out edge cases.  Sometimes the savings is worth it.
       
                  kappaking wrote 10 hours 27 min ago:
                  > Given that, people need to accept higher costs, longer
                  development times, or reduced scope if they want better
                  optimized games.
                  
                  God why can’t it just be longer development time. I’m
                  sick of the premature fetuses of games.
       
                    unusualmonkey wrote 4 hours 16 min ago:
                    Just wait until after launch. You get a refined experience
                    and often much lower prices.
       
                    maccard wrote 9 hours 25 min ago:
                    > God why can’t it just be longer development time.
                    
                    Where do you stop? What do the 5 tech designers do while
                    the 2 engine programmers optimise every last byte of
                    network traffic?
                    
                    > I’m sick of the premature fetuses of games.
                    
                    Come on, keep this sort of crap off here. Games being janky
                    isn't new -  look at old console games and they're
                    basically duct taped together. Go back to Half-life 1 in
                    1998 - the Xen world is complete and utter trash. Go back
                    farther and you have stuff that's literally unplayable [0],
                    or things that were so bad they literally destroyed an
                    entire industry [1], or rendered the game uncompleteable
                    [2].
                    
                    [0] [1] [2]
                    
 (HTM)              [1]: https://en.wikipedia.org/wiki/Dr._Jekyll_and_Mr._H...
 (HTM)              [2]: https://www.theguardian.com/film/2015/jan/30/a-gol...
 (HTM)              [3]: https://www.reddit.com/r/gamecollecting/comments/h...
       
                      colechristensen wrote 8 hours 7 min ago:
                       Super Mario 64, widely recognized as one of the
                       most iconic and influential games ever, was
                       released with a build that didn't have compiler
                       optimizations turned on. This was proven by
                       decompiling it and, with the exact right
                       compiler and tools, recompiling it with the
                       non-optimized arguments. Recompiling with the
                       optimizations turned on resulted in no problems
                       and significant performance boosts.
                      
                      One of the highest rated games ever released without devs
                      turning on the "make it faster" button which would have
                      required approximately zero effort and had zero
                      downsides.
                      
                      This kind of stuff happens because the end result A vs. B
                      doesn't make that much of a difference.
                      
                       And it's very hard to have a culture of quality
                       that doesn't get overrun by zealots who will
                       bankrupt you while they squeeze the last 0.001%
                       of performance out of your product before
                       releasing. It is very hard to have a culture of
                       quality that does the important things and
                       doesn't do the unimportant ones.
                      
                      The people who obsess with quality go bankrupt and the
                      people who obsess with releasing make money. So that's
                      what we get.
                      
                      A very fine ability for evaluating quality mixed with
                      pragmatic choice for what and when to spend time on it is
                      rare.
       
                    Cyphusx wrote 9 hours 57 min ago:
                    The trade off they're talking about is to arrive at the
                    same end product.
                    
                    The reason games are typically released as "fetuses" is
                    because it reduces the financial risk.    Much like any
                    product, you want to get it to market as soon as is
                    sensible in order to see if it's worth continuing to spend
                    time and money on it.
       
                      mort96 wrote 7 hours 49 min ago:
                      And this really shouldn't surprise professionals in an
                      industry where everything's always about development
                      velocity and releasing Minimum Viable Products as quickly
                      into the market as possible.
       
          MattGaiser wrote 13 hours 41 min ago:
          Probably because many are purists. It is like how anything about
          improving Electron devolves into "you shouldn't use Electron."
          
          Many would consider this a bare minimum rather than something worthy
          of praise.
       
            mschuster91 wrote 13 hours 14 min ago:
            > Probably because many are purists. It is like how anything about
            improving Electron devolves into "you shouldn't use Electron."
            
             The Electron debate isn't about detail purism, the Electron debate
             is about the foundation being a pile of steaming dung.
            
            Electron is fine for prototyping, don't get me wrong. It's an easy
            and fast way to ship an application, cross-platform, with minimal
            effort and use (almost) all features a native app can, without
            things like CORS, permission popups, browser extensions or god
            knows what else getting in your way.
            
             But it should always stay a prototype and eventually be shifted to
             a native application. Unlike Internet Explorer in its heyday,
             which you could trivially embed as ActiveX without it leading to
             resource gobbling, Electron is like PHP - nothing is shared. If
             you have ten apps each consuming 1GB of RAM just for the Electron
             base to run, the user runs out of memory.
       
              zamadatix wrote 12 hours 29 min ago:
               Each person seems to have their own bugbear about Electron but I
               really doubt improving Electron to have shared instances a la
               WebView2 would make much of a dent in the hate for it here.
       
              saratogacx wrote 12 hours 32 min ago:
              Removing layers is hard though, better to have electron host a
              WASM application which will become a new "native" that gets
              argued semantically.
       
              jauntywundrkind wrote 12 hours 33 min ago:
              Or these devs & users can migrate to a PWA. Which will have
              vastly less overhead. Because it is shared, because each of those
              10 apps you mention would be (or could be, if they have ok data
              architecture) tiny.
       
                mschuster91 wrote 9 hours 1 min ago:
                > Or these devs & users can migrate to a PWA
                
                PWAs have the problem that for every interaction with the "real
                world" they need browser approval. While that is for a good
                reason, it also messes with the expectations of the user, and
                some stuff such as unrestricted access to the file system isn't
                available to web apps at all.
       
          scsh wrote 13 hours 47 min ago:
          It's because shitting on game devs is the trendy thing these days,
          even among more technically inclined crowds unfortunately.  It seems
          like there's a general unwillingness to accept that game development
          is hard and you can't just wave the magic "optimize" wand at
          everything when your large project is also a world of edge cases. 
          But it seems like it should be that simple according to all the
          armchair game devs on the internet.
       
            red-iron-pine wrote 12 hours 21 min ago:
             the engineer's disease: "i'm smarter than you and I need to prove
            it, and we're so smart we wouldn't have shipped this code in the
            first place" bla bla bla
            
            also keep in mind that modern gaming generates more revenue than
            the movie industry, so it's in the interests of several different
            parties to denigrate or undermine any competing achievement --
            "Bots Rule Every Thing Around Me"
       
            jeffwask wrote 12 hours 34 min ago:
            For me it's not so much about shitting on game devs as it is about
            shitting on the ogres that run game companies. Any of us who have
            done development should understand we have little control over
            scope and often want to do more than the business allows us to.
       
              scsh wrote 11 hours 41 min ago:
              That is completely ok in my opinion.  It's just most discourse I
              come across treats the developers as complete amateurs who don't
              know what they're doing.  As someone who's a professional dev
              myself I just can't get behind bashing the people doing the
              actual work when I know we're all dealing with the same business
              realities, regardless of industry.
       
            embedding-shape wrote 12 hours 58 min ago:
            Meh, the same is true for almost every discussion on the internet,
            everyone is an expert armchair for whatever subject you come
            across, and when you ask them about their experience it boils down
            to "I read lots of Wikipedia articles".
            
            I mean I agree with you, that it is trendy and seemingly easy, to
            shit on other people's work, and at this point it seems to be a
            challenge people take up upon themselves, to criticise something in
            the most flowery and graphic way as possible, hoping to score those
            sweet internet points.
            
             Maybe 6-7 years ago I stopped reading reviews and opinions about
             newly launched games completely; the internet audience (and
             reviewers) are just so far off base compared to my own perspective
             and experience that it has become less than useless, it's just
             noise at this point.
       
              AngryData wrote 6 hours 27 min ago:
               I wish many people's "expertise" at least amounted to reading
              wikipedia. It seems for many that is too much and they either
              make crap up on the spot or latch onto whatever the first thing
              they find that will confirm their biases regardless of how true
              it is.
       
            buildbot wrote 13 hours 3 min ago:
            The level of work that goes into even “small” games is pretty
            incredible. When I was a grad student another student was doing
            their (thesis based, research focused) masters while working at EA
            on a streetfighter(?) game.
            
            The game programming was actually just as research focused and
            involved as the actual research. They were trying to figure out how
            to get the lowest latency and consistency for impact sounds.
       
            taeric wrote 13 hours 4 min ago:
            There has long been a trend that "software engineers" and "computer
            scientists" both have been rather uninterested in learning the
            strategies that gaming developers use.
            
            Really, the different factions in software development are a
            fascinating topic to explore.  Add embedded to the discussion, and
            you could probably start fights in ways that flat out don't make
            sense.
       
        renewiltord wrote 14 hours 9 min ago:
        Pretty cool. I think it’s completely normal to be under a crunch and
        just go with some standard practices under normal conditions. Cool that
        they went back and sorted it out afterwards!
        
        I’ve got to say. I do find it somewhat unusual that despite the fact
        that every HN engineer has John Carmack level focus on craftsmanship,
        about 1/100k here produce that kind of outcome.
        
        I don’t get it. All of you guys are good at pointing out how to do
        good engineering. Why don’t you make good things?
       
        dnrvs wrote 15 hours 5 min ago:
        too many arm chair game devs who think they know better in this thread
       
          ycombinatrix wrote 8 hours 55 min ago:
          "Don't 6x your game's install size for no measurable benefit to
          users"
          
          Wow! It looks like I do indeed know better.
       
          lordnikon001 wrote 10 hours 41 min ago:
          I think what irks people is the number one rule of optimization is to
          always measure
          
          You never assume something is an optimization or needed and never do
          hypothetical optimizations
          
          I can see why it would happen in this case though, gamedev is chaotic
          and you're often really pressed for time
       
            forrestthewoods wrote 9 hours 19 min ago:
            WebDevs who have build systems that take ten minutes and download
            tens of megabytes of JS and have hundreds of milliseconds of lag
            are sooooooooooooo not allowed to complain about game devs ever.
       
              Dylan16807 wrote 9 hours 15 min ago:
              Oh, at first I thought you were talking about websites doing that
              and I was going to say "sure, those people can't complain, but
              the rest of us can".
              
              Then I realized you said build systems and eh, whatever.  It's
              not good for build systems to be bloated, but it matters a lot
              less than the end product being bloated.
              
              And you seem to be complaining about the people that are dealing
              with these build systems themselves, not inflicting them on other
              people?  Why don't they get to complain?
       
                bigstrat2003 wrote 6 hours 42 min ago:
                > Oh, at first I thought you were talking about websites doing
                that
                
                I'm pretty sure that is in fact what he meant, and that "have
                build systems" is a typo of "have built systems".
       
                forrestthewoods wrote 8 hours 3 min ago:
                Download bloat is net less impactful than build time bloat
                imho. Game download and install size bloat is bad. But is a
                mostly one time cost. Build time bloat doesn’t directly
                impact users, but iteration time is GodKing so bad build times
                indirectly hurt consumers.
                
                But that’s all beside the point. What I was really doing was
                criticizing the  HN commenters. HN posters are mostly webdevs
                because most modern programmers are webdevs. And while I
                 won’t say the file bloat here wasn’t silly, I won’t stand
                for game dev slander from devs that commit faaaaaaaaaaaaaar
                greater sins.
       
                  Dylan16807 wrote 6 hours 11 min ago:
                  Web devs are not a hivemind.  That kind of criticism doesn't
                  work well at all when pointed at the entirety of the site.
                  
                  > Download bloat is net less impactful than build time bloat
                  imho.
                  
                  Download bloat is a bad problem for people on slow
                  connections, and there's a lot of people on slow connections.
                   I really dislike when people don't take that into account.
                  
                  And even worse if they're paying by the gigabyte in a country
                  with bad wireless prices, that's so much money flushed down
                  the drain.
       
                    forrestthewoods wrote 5 hours 31 min ago:
                    Believe you me I wish every website worked on 2G. HN is
                    great at least.
                    
                    For consoles total disk space is an even bigger constraint
                    than download size. But file size is a factor. Call of Duty
                    is known to be egregious. It’s a much more complex
                    problem than most people realize. Although hopefully not as
                    trivial a fix as Helldivers!
                    
                    In any case HN has a broadly dismissive attitude towards
                    gamedevs. It is irksome. And deeply misplaced. YMMV.
       
        everdrive wrote 16 hours 12 min ago:
        I love Helldivers 2, but from what I can tell it's a bunch of
        enthusiasts using a relatively broken engine to try to do cool stuff.
        It almost reminds me of the first pokemon game. I'll bet there's all
        sorts of stuff they get wrong from a strictly technical standpoint. I
        love the game so much I see this more as a charming quirk than I do
        something which really deserves criticism. The team never really
        expected their game to be as popular as it's become, and I think we're
        still inheriting flaws from the surprise interest in the game. (some of
        this plays out in the tug of war between the dev team's hopes for a
         realistic grunt fantasy vs. the player base's horde power fantasy.)
       
          embedding-shape wrote 14 hours 28 min ago:
          This would make sense if it was a studio without experience, and
          without any external help, but their publisher is Sony Interactive
          Entertainment, which also provides development help when needed,
          especially optimizations and especially for PS hardware. SIE seems to
          have been deeply involved with Helldivers 2, doubling the budget and
          doubling the total development time. Obviously it was a good choice
          by SIE, it paid off, and of course there is always 100s of more
          important tasks to do before launching a game, but your comment reads
          like these sort of problems were to be expected because the team
          started out small and inexperienced or something.
       
            shadowgovt wrote 13 hours 58 min ago:
            Sony also published No Man's Sky.
            
            I'm not sure having the support of Sony is that gold-standard
            imprint that people think it is.
       
              embedding-shape wrote 13 hours 2 min ago:
              No Man's Sky didn't have technical issues at launch though, it
               ran fine for what it was. The problem with NMS was that people
              were told it would be a completely different experience compared
              to what it ended up being (at launch).
       
            everdrive wrote 14 hours 17 min ago:
            >but your comment reads like these sort of problems were to be
            expected because the team started out small and inexperienced or
            something.
            
            More or less nothing is optimized these days, and game prices and
            budgets have gone through the roof. Compared to the other games
            available these days (combined with how fun the game is) I
            definitely give HD2 a big pass on a lot of stuff. I'm honestly
            skeptical of Sony's involvement being a benefit, but that's mostly
            due to my experience regarding their attempts to stuff a PSN ID
            requirement into HD2 as well as their general handling of their
            IPs. (Horizon Zero Dawn is not only terrible, but they seem to try
            to force interest with a new remake on a monthly basis.)
       
              embedding-shape wrote 14 hours 2 min ago:
              > More or less nothing is optimized these days
              
              Not true, lots of games are optimized, but it's one of those
              tasks that almost no one notices when you do it great, but
              everyone notices when it's missing, so it's really hard to tell
              by just consuming ("playing") games.
              
              > I'm honestly skeptical of Sony's involvement being a benefit
              
              I'm not, SIE have amazing engineers, probably the best in the
              industry, and if you have access to those kind of resources, you
              use it. Meanwhile, I agree that executives at Sony sometimes have
              no clue, but that doesn't mean SIE helping you with development
              suddenly has a negative impact on you.
       
                everdrive wrote 13 hours 50 min ago:
                >Not true, lots of games are optimized,
                
                I don't mean this is a counter-argument -- I'm really
                interested. What are some good examples of very recent
                optimized games?
       
                  embedding-shape wrote 12 hours 55 min ago:
                  BF6 comes to mind, out of newly released games. Arc Raiders
                  too, seems to have avoided the heap of criticism because of
                  performance, meaning it is probably optimized enough so
                  people don't notice issues. Dyson Sphere Program (yet to be
                  released) is a bit older, and indie, but very well optimized.
       
                    everdrive wrote 12 hours 44 min ago:
                    Thanks for the list -- now that you mention it, I recall
                    being quite surprised to learn that Arc Raiders was not
                    only an UE5 game but would also run nicely on my PC. (I
                    haven't played it, but a friend asked me to consider it)
                    Now that you mention it as well, I think I recall the BF6
                    folks talking specifically about not cramming too many
                    graphical techniques into their games so that people could
                    actually play the game.
                    
                    Thanks for the list!
       
                      embedding-shape wrote 12 hours 40 min ago:
                      > I recall being quite surprised to learn that Arc
                      Raiders was not only an UE5 game but would also run
                      nicely on my PC
                      
                      Yeah, Unreal Engine (5 almost specifically) is another
                      example of things that are unoptimized by default, very
                      easy to notice, but once you work on it, it becomes
                      invisible and it's not that people suddenly cheer, you
                      just don't hear complaints about it.
                      
                      It's also one of those platforms where there is a ton of
                      help available from Epic if you really want it, so you
                      can tune the defaults BEFORE you launch your game, but
                      hardly anyone seemingly does that, and then both
                      developers and users blame the engine, instead of blaming
                      the people releasing the game. It's a weird affair all
                      around :)
       
          FieryMechanic wrote 14 hours 54 min ago:
          A lot of people in the comments here don't seem to understand that it
          is a relatively small game company with an outdated engine. I am a
          lot more forgiving  of smaller organisations when they make mistakes.
          
          The game has semi-regular patches where they seem to fix some things
          and break others.
          
           The game has a lot of hidden mechanics that aren't obvious from the
           tutorial, e.g. many weapons have different fire modes and fire
           rates, and stealth is an option in the game. The game has a decent
           community and people are friendly for the most part; it also has the
           "feature" of being able to be played for about 20-40 minutes, and
           you can just put it down again for a bit and come back.
       
            123malware321 wrote 14 hours 32 min ago:
            considering it still cost 40$ for a 2 year old game, i think they
            are way beyond the excuse of small team low budget trying to make
            cool stuff. They have receive shit tons of money and are way to
            late trying to optimise the game. When it came out it ran so
            pisspoor i shelved it for a long time. Trying it recently its only
            marginally better. its really poorly optimised, and blaming old
            tech is nonsense.
            
            People make much more smooth and complex experiences in old
            engines.
            
             You need to know your engine as a dev and not cross its limits at
             the cost of user experience and then blame your tools....
            
             The whole story about more data making load times better is utter
             rubbish. It's a sign of pisspoor resource management and usage.
             For the game they have, they should have realized a 130GB install
             is unacceptable. It's not like they have very elaborate
             environments - a lot of similar textures and structures
             everywhere. It's not like it's some huge unique world like The
             Witcher or such games...
            
            There is an astronomical amount of information available for free
            on how to optimise game engines, loads of books, articles, courses.
            
            How much money do you think they have made so far?
            
            "Arrowhead Game Studios' revenue saw a massive surge due to
            Helldivers 2, reporting around $100 million in turnover and $76
            million in profit for the year leading up to mid-2025,
            significantly increasing its valuation and attracting a 15.75%
            investment from Tencent"
            
            75 million in profit but can't figure out how to optimise a game
            engine. get out.
       
              FieryMechanic wrote 12 hours 12 min ago:
              Compared to the bigger gaming studios they are small. In fact
              they are not that much larger than the company I work for (not a
              game studio).
              
              The fact it is un-optimised can be forgiven because the game has
              plenty of other positives so people like myself are willing to
              look over them.
              
              I've got a few hundred hours in the game (I play for maybe an
              hour in the evening) and for £35 it was well worth the money.
       
              the_af wrote 13 hours 27 min ago:
              What does the age of the game in years have to do with anything?
              
              A fun game is a fun game.
       
              shadowgovt wrote 13 hours 59 min ago:
              It costs $40 for a 2-year-old game because the market is bearing
              $40 for a 2-year-old game.
              
              If anything, it's a testament to how good a job they've done
              making the game.
       
                aftbit wrote 13 hours 42 min ago:
                The most recent Battlefield released at $80. Arc Raiders
                released at $40 with a $20 deluxe edition upgrade. I think $40
                for a game like Helldivers 2 is totally fair. It's a fun game,
                worth at least 4 to 8 hours of playtime.
       
                  debugnik wrote 12 hours 10 min ago:
                  > worth at least 4 to 8 hours of playtime.
                  
                  Is that supposed to be praise?
       
                    aftbit wrote 7 hours 30 min ago:
                    Ah sorry, I thought "at least" would carry this statement.
                    I've played Helldivers for more than 250 hours personally.
                    
                    For some reason, though, I tend to compare everything to
                    movie theater tickets. In my head (though it's not true
                    anymore), a movie ticket costs $8 and gives me 1 hour of
                    entertainment. Thus anything that gives me more than 1 hour
                    per $8 is a good deal.
                    
                    $40 / 4 => $10/hr
                    
                    $40 / 8 => $5/hr
                    
                    Thus, I think Helldivers is a good deal for entertainment
                    even if you only play it for under 10 hours.
       
                    everdrive wrote 11 hours 6 min ago:
                    It's a comment about cost-to-hourly-entertainment. eg: if
                    in the general sense you're spending $5-$10 per hour of
                    entertainment you're doing at least OK. I understand that a
                    lot of books and video games can far exceed this, but it's
                    just a general metric and a bit of a low bar to clear. (I
                    have a LOT more hours into the game so from my perspective
                    my $40 has paid quite well.)
       
                    entropie wrote 11 hours 49 min ago:
                     It's also wrong. With 10 hours of Helldivers 2 you haven't
                     seen much of the game at all.
                    
                    I played it a bit after release and have 230 hours. I liked
                    the game and it was worth my money.
       
                      aftbit wrote 7 hours 28 min ago:
                      Yeah, I meant "at least" 4-8 hours. Even if you get bored
                      and give up after that, you've gotten your money's worth,
                      in my opinion.
                      
                      I have almost 270 hours in Helldivers 2 myself. Like any
                      multiplayer game, it can expand to fill whatever amount
                      of time you want to dedicate to it, and there's always
                      something new to learn or try.
       
            heftig wrote 14 hours 47 min ago:
            The bad tutorial at least has some narrative justification. It's
            just a filter for people who are already useful as shock troops
            with minimal training.
       
              banannaise wrote 13 hours 1 min ago:
              Not only does the bad tutorial have an in-universe justification;
              the ways in which it is bad are actually significant to the
              worldbuilding in multiple ways.
              
              The missing information also encourages positive interactions
              among the community - newer players are expected to be missing
              lots of key information, so teaching them is a natural and
              encouraged element of gameplay.
              
              I stopped playing the game awhile ago, but the tutorial always
              struck me as really clever.
       
              FieryMechanic wrote 14 hours 43 min ago:
              I also think that the tutorial would be tedious if it went
              through too much of the mechanics. They show you the basics, the
              rest you pick up through trial and error.
       
                red-iron-pine wrote 12 hours 19 min ago:
                aye.  give me the 3 minute tutorial, not the 37 minute
                tutorial.
                
                 i want to play the game, like now, and i'll read the forums
                 after i figure out that i'm missing something important
       
          heftig wrote 15 hours 4 min ago:
           The game logic is also weird. It seems like they started with an
           attempt at a realistic combat simulator which then had lots of
          unrealistic mechanics added on top in an attempt to wrangle it into
          an enjoyable game.
          
          As an example for overly realistic physics, projectile damage is
          affected by projectile velocity, which is affected by weapon
          velocity. IIRC, at some point whether you were able to destroy some
          target in two shots of a Quasar Cannon or three shots depended on if
          you were walking backwards while you were firing, or not.
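           The relative-velocity effect described above can be sketched as a
           toy model (a hypothetical illustration assuming kinetic-energy-style
           damage scaling; none of the names or numbers are actual game
           values):

```python
def effective_damage(base_damage: float, muzzle_velocity: float,
                     shooter_velocity: float) -> float:
    """Toy model: damage scales with the square of the projectile's
    net velocity (kinetic energy ~ v^2), so a shooter walking
    backwards (negative velocity, away from the target) subtracts
    from the muzzle velocity and lowers the damage dealt."""
    net_velocity = muzzle_velocity + shooter_velocity
    return base_damage * (net_velocity / muzzle_velocity) ** 2

# Standing still: full damage. Retreating at 3 m/s from a 50 m/s
# projectile: ~88% damage, enough to push a 2-shot kill to 3 shots.
standing = effective_damage(100.0, 50.0, 0.0)     # 100.0
retreating = effective_damage(100.0, 50.0, -3.0)  # ~88.4
```

           Whether the real game uses this exact scaling is not stated; the
           point is only that even a small shooter velocity changes the
           breakpoint between a two-shot and a three-shot kill.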
       
            embedding-shape wrote 14 hours 35 min ago:
            > depended on if you were walking backwards while you were firing
            
            That sounds like a bug, not an intentional game design choice about
            the game logic, and definitely unrelated to realism vs not realism.
            Having either of those as goals would lead to "yeah, bullet
            velocity goes up when you go backwards" being an intentional
            mechanic.
       
              heftig wrote 13 hours 43 min ago:
              To be clear, walking backwards (away from the target) reduced
              your bullet velocity relative to the target, reducing the damage
              you were doing and leading to you needing more shots.
       
                embedding-shape wrote 12 hours 57 min ago:
                 And to be extra clear, either way, neither option makes me
                believe it was an intentional design choice.
       
                  thunderfork wrote 7 hours 21 min ago:
                  Systems-driven gameplay is an intentional design choice all
                  unto itself
       
          philistine wrote 15 hours 5 min ago:
           You hit the nail on the head with the first Pokémon, but Helldivers
          2 is an order of magnitude smaller in the amateur-to-success ratio.
          
           Game Freak could not finish the project, so they had to be bailed
           out by Nintendo with an easy-to-program game so the company could
           get some
          much needed cash (the Yoshi puzzle game on NES). Then years later,
          with no end to the game in sight, Game Freak had to stoop to
          contracting Creatures inc. to finish the game. Since they had no
          cash, Creatures inc. was paid with a portion of the Pokémon
          franchise.
          
          Pokémon was a shit show of epic proportions. If it had been an SNES
          game it would have been canceled and Game Freak would have closed.
          The low development cost of Game Boy and the long life of the console
          made Pokémon possible.
       
          chamomeal wrote 15 hours 23 min ago:
          The game is often broken but they’ve nailed the physics-ey feel so
          hard that it’s a defining feature of the game.
          
          When an orbital precision strike reflects off the hull of a factory
          strider and kills your friend, or eagle one splatters a gunship, or
          you get ragdolled for like 150m down a huge hill and then a
           devastator kills you with a dispassionate stomp.
          
          Those moments elevate the game and make it so memorable and
          replayable. It feels like something whacky and new is around every
          corner. Playing on PS5 I’ve been blessed with hardly any
          game-breaking bugs or performance issues, but my PC friends have
          definitely been frustrated at times
       
            rimunroe wrote 13 hours 29 min ago:
            I think it has the best explosions in any game I've played too.
            They're so dang punchy. Combined with their atmospheric effects
            (fog and dust and whatnot) frantic firefights with bots look
            fantastic.
       
            speeder wrote 15 hours 2 min ago:
            All other games from the same studio have the same features.
            
             In fact, the whole point of their games is that they are coop
             games where it is easy to accidentally kill your allies in
             hilarious ways. It is the reason, for example, why casting
             stratagems uses complex key sequences: it is intentional, so that
             you can make a mistake and cast the wrong thing.
       
              aftbit wrote 13 hours 39 min ago:
              It's actually a really nice spell casting system. It lets you
              have a ton of different spells with only 4 buttons. It rewards
              memorizing the most useful (like reinforce). It gives a way for
              things like the squid disruptor fields or whatever they're called
              to mess with your muscle memory while still allowing spells. It
              would be way less interesting if it was just using spell slots
              like so many other games.
       
              heftig wrote 14 hours 57 min ago:
              The only wrong thing I've been throwing is the SOS Beacon instead
              of a Reinforce, which is just annoying, and not just once. It
              makes the game public if it was friends-only and gives it
              priority in the quick play queue. So that can't be it.
              
              The dialing adds friction to tense situations, which is okay as a
              mechanic.
       
            whalesalad wrote 15 hours 7 min ago:
            It's such a janky game. Definitely feels like it was built using
            the wrong tool for the job. Movement will get stuck on the most
            basic things. Climbing and moving over obstacles is always a yucky
            feeling.
       
          delichon wrote 15 hours 42 min ago:
          Thank you for your service in keeping the galaxy safe for managed
          democracy.
       
          rincebrain wrote 15 hours 45 min ago:
          A lot of things suddenly made sense when I learned their prior work
          was Magicka.
       
            Zarathruster wrote 8 hours 27 min ago:
            Yeah the "Crash to Desktop" comedy spell wasn't added to the game
            for no good reason.
            
            I do credit their sense of humor about it though.
       
            SpaceManNabs wrote 10 hours 7 min ago:
            Is that a negative? All of the "negative" things listed make me
            think that they are really cool and trying to learn stuff and
            challenge things.
       
            moritonal wrote 13 hours 14 min ago:
             Oh my, I loved that game! It's wild that everyone's throwing
             shade at Helldivers whilst ignoring that it was a massive
             success because of how fun it is. I've said it before: devs are
             really bad at understanding the art of making fun experiences.
       
            jfindper wrote 15 hours 5 min ago:
            I never played Magicka, but the reviews seem fine (76%
            GameRankings, 74/100 Metacritic, 8/10 EuroGamer, etc.)
            
               Was it a bad game? Or janky? What parts of Helldivers are
               "making sense" now?
       
              darthcircuit wrote 14 hours 48 min ago:
              Not op, but magicka is a pretty fun game.
              
              You cast spells in a similar way as calling in strategems in hd2.
              
               The spell system was super neat. There are several different
               elements (fire, air, water, earth, electricity, ice, and maybe
               something else; it's been a while since I played). Each element
               can be used on its own or combined. Different combinations
               cast different spells: fire + water makes steam, for instance;
               ice + air is a focused blizzard, etc.
              
              there’s hundreds to learn and that’s your main weapon in the
              game. There’s even a spell you can cast that will randomly kick
              someone you’re playing with out of the game.
              
              It’s great fun with friends, but can be annoying to play
              sometimes. If you try it, go with kb/m. It supports controller,
              but is way more difficult to build the spells.
       
                finalarbiter wrote 14 hours 31 min ago:
                > maybe something else
                
                Water, Life, Arcane, Shield, Lightning, Cold, Fire, and Earth.
                [0] It's worth noting that, though you can combine most of the
                elements to form new spells (and with compounding effects, for
                example wetting or steaming an enemy enhances lightning
                damage), you cannot typically combine opposites like
                lightning/ground, which will instead cancel out. Killed myself
                many times trying to cast lightning spells while sopping wet.
                
                In my experience, though, nobody used the element names—my
                friends and I just referred to them by their keybinds. QFASA,
                anyone?
                
                [0]
                
 (HTM)          [1]: https://magicka.fandom.com/wiki/Elements
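
                 The queue-then-cancel behavior described above can be
                 modeled in a few lines. This is a toy approximation, not
                 Magicka's actual rules: the opposite pairs below are
                 illustrative, not the full table.

```python
# Toy model of Magicka-style element queuing: elements are queued in
# order, but an element cancels against a queued opposite instead of
# combining with it. Pairs here are illustrative, not exhaustive.
OPPOSITES = {
    ("Lightning", "Water"), ("Water", "Lightning"),
    ("Fire", "Cold"), ("Cold", "Fire"),
}

def queue_elements(presses: list[str]) -> list[str]:
    """Queue each pressed element; opposites annihilate on contact."""
    queued: list[str] = []
    for element in presses:
        for q in queued:
            if (element, q) in OPPOSITES:
                queued.remove(q)   # opposites cancel out
                break
        else:
            queued.append(element)
    return queued
```

                 This is also why casting Lightning while wet backfires: the
                 "wet" status effectively pre-queues Water against you.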
       
                  jamesgeck0 wrote 14 hours 1 min ago:
                  This is the most Helldivers 2 part for me. Spells being
                  intentionally tricky to execute, combined with accidental
                  element interactions and "friendly fire."
       
            brainzap wrote 15 hours 34 min ago:
            oh no
       
        _aavaa_ wrote 17 hours 47 min ago:
        My takeaway is that it seems like they did NO benchmarking of their own
        before choosing to do all that duplication. They only talk about
        performance tradeoff now that they are removing it. Wild
       
          jayd16 wrote 13 hours 39 min ago:
          You can't bench your finished game before it exists and you don't
          really want to rock the boat late in dev, either.
          
           It was a fundamentally sound default that they revisited. Then
           they blogged about the relatively surprising difference it
           happened to make in their particular game. As it turns out, the
           loading is CPU bound anyway, so while the setting is doing its
           job, in the context of the final game it happens not to be the
           bottleneck.
          
          There's also the movement away from HDD and disc drives in the player
          base to make that the case as well.
       
          dwroberts wrote 15 hours 4 min ago:
          It seems plausible to me that this strategy was a holdover from the
          first game, which shipped for PS4 and XBO
          
          I don’t know about the Xbox, but on PS4 the hard drive was
          definitely not fast at all
       
          whizzter wrote 15 hours 12 min ago:
          It's an valid issue, those of us who worked back in the day on
          GD/DVD,etc games really ran into bad loading walls if we didn't
          duplicate data for straight streaming.
          
          Data-sizes has continued to grow and HDD-seek times haven't gotten
          better due to physics (even if streaming probably has kept up), the
          assumption isn't too bad considering history.
          
           It's good that they actually revisited it _when they had time_,
           because launching a game, especially a multiplayer one, will run
           into a lot of breaking bugs, and this (while a big one, pun
           intended) is still by most classifications a lower-priority issue.
       
          Pannoniae wrote 16 hours 1 min ago:
          The good old "studios don't play their own games" strikes again :P
          
          Games would be much better if all people making them were forced to
          spend a few days each month playing the game on middle-of-the-road
          hardware. That will quickly teach them the value of fixing stuff like
          this and optimising the game in general.
       
            maccard wrote 14 hours 10 min ago:
            I've worked in games for close to 15 years, and every studio I've
            worked on we've played the game very regularly. My current team
            every person plays the game at least once a week, and more often as
            we get closer to builds.
            
            In my last project, the gameplay team played every single day.
            
            > Games would be much better if all people making them were forced
            to spend a few days each month playing the game on
            middle-of-the-road hardware
            
            How would playing on middle of the road hardware have caught this?
            The fix to this was to benchmark the load time on the absolute
            bottom end of hardware, with and without the duplicated logic.
            Which you'd only do once you have a suspicion that it's going to be
            faster if you change it...
       
            whizzter wrote 15 hours 3 min ago:
            People literally play the games they work on all the time, it's
            more or less what most do.
            
            Pay 2000$ for indie games so studios could grow up without being
            beholden to shareholders and we could perhaps get that "perfect"
            QA,etc.
            
            It's a fucking market economy and people aren't making pong level
            games that can be simply tuned, you really get what you pay for.
       
            Forgeties79 wrote 15 hours 29 min ago:
            They could have been lying I guess but I listened to a great
            podcast about the development of Helldivers 2 (I think it was
            gamemakers notebook) and one thing that was constantly brought up
            was as they iterated they forced a huge chunk of the team to sit
            down and play it. That’s how things like diving from a little bit
            too high ended up with you faceplanting and rag-dolling, tripping
            when jet packing over a boulder that you get a little too close to,
            etc. They found that making it comically realistic in some areas
            led to more unexpected/emergent gameplay that was way more
            entertaining. Turrets and such not caring if you’re in the line
            of fire was brought up I believe.
            
            That’s how we wound up with this game where your friends are as
            much of a liability as your enemies.
       
          Hendrikto wrote 16 hours 15 min ago:
          > our worst case projections did not come to pass. These loading time
          projections were based on industry data - comparing the loading times
          between SSD and HDD users where data duplication was and was not
          used. In the worst cases, a 5x difference was reported between
          instances that used duplication and those that did not. We were being
          very conservative and doubled that projection again to account for
          unknown unknowns.
          
          They basically just made the numbers up. Wild.
       
            fullstop wrote 15 hours 5 min ago:
            It's like the story of a young couple cooking their first Christmas
            ham.
            
            The wife cuts the end off of the ham before putting it in the oven.
             The husband, unwise in the ways of cooking, asks her why she does
            this.
            
            "I don't know", says the wife, "I did it because my mom did it."
            
            So they call the mom.  It turns out that her mother did it, so she
            did too.
            
            The three of them call the grandma and ask "Why did you cut the end
            off of the ham before cooking it?"
            
            The grandma laughs and says "I cut it off because my pan was too
            small!"
       
              bombcar wrote 14 hours 32 min ago:
              It's the corollary to Chesterton's Fence - don't remove it until
              you know why it's there, but also investigate why it's there.
       
              chrisweekly wrote 14 hours 55 min ago:
              Haha, cargo cult strikes again!
       
                01HNNWZ0MV43FF wrote 14 hours 20 min ago:
                For today's 10,000: [1] > The pop-culture cargo cult
                description, however, takes features of some cargo cults (the
                occasional runway) and combines this with movie scenes to yield
                 an inaccurate and fictionalized description. It may be hard to
                believe that the description of cargo cults that you see on the
                internet is mostly wrong, but in the remainder of this article,
                I will explain this in detail.
                
 (HTM)          [1]: https://www.righto.com/2025/01/its-time-to-abandon-car...
       
            rjzzleep wrote 15 hours 48 min ago:
            On the flip side I don't remember who did it, but basically
            extracting textures on disk fixed all the performance issues UE5
               has on some benchmarks (sorry for being vague, but I can't find the
            source material right now). But their assumption is in fact a sound
            one.
       
              Normal_gaussian wrote 15 hours 29 min ago:
               Yes. It's quite common for games to have mods that repack
               textures or significantly tweak the UE5 config at the moment -
               and it's very common to see users applying them when it
               doesn't actually affect their use cases.
              
              As an aside, I do enjoy the modding community naming over
              multiple iterations of mods - "better loading" -> "better better
              loading" -> "best loading" -> "simplified loading" -> "x's
              simplified loading" -> "y's simplified loading" -> "z's better
              simplified loading". Where 'better' is often some undisclosed
              metric based on some untested assumptions.
       
          wongarsu wrote 16 hours 43 min ago:
          It's pretty standard to do that duplication for games on CD/DVD
          because seek times are so long. It probably just got carried over as
          the "obviously correct" way of doing things, since HDDs are like DVDs
          if you squint a bit
       
            Arrath wrote 9 hours 44 min ago:
            I had assumed the practice started to die off when installing games
            became dominant over streaming from the disc even on consoles.
            Seems I was wrong!
       
            jayd16 wrote 13 hours 33 min ago:
            The game does ship on disc for console, no?
       
              Teknoman117 wrote 11 hours 40 min ago:
              The current generation of consoles can’t play games directly
              off the disk. They have to be installed to local storage first.
       
          Xelbair wrote 17 hours 4 min ago:
          worse, in their post they basically said:
          
          >we looked at industry standard values and decided to double them
          just in case.
       
            red-iron-pine wrote 10 hours 58 min ago:
             this is one of the best selling games in history, and is
             immensely popular across the globe.
            
            it had no serious or glaring impact to their bottom line.
            
            thus it was the right call, and if they didn't bother to fix it
            they'd still be rolling in $$$$
       
              ycombinatrix wrote 8 hours 59 min ago:
              All companies should also defraud & rug pull their customers.
              
              It will make them a lot of money and is thus the right call. Who
              cares about customers am I right? They'd still be rolling in
              $$$$.
       
            functionmouse wrote 17 hours 3 min ago:
            Some kind of evil, dark counterpart to Moore's law in the making
       
          djmips wrote 17 hours 10 min ago:
          A tale as old as time. Making decisions without actually profiling
          before, during and after implementing.
       
          Hikikomori wrote 17 hours 11 min ago:
           They used industry data to make the decision first, to avoid
           potential multi-minute load times for 10% or so of their players;
           it's hard to test all kinds of PC configurations. Now they have
           telemetry showing that it doesn't matter, because another parallel
           task takes about as much time anyway.
       
            _aavaa_ wrote 14 hours 10 min ago:
             But they did NOT know it would lead to multi-minute load times.
             They did not measure a baseline.
             
             Instead they blindly did extra work and 6x'ed the storage
             requirement.
       
            justsomehnguy wrote 15 hours 29 min ago:
             So they prematurely optimized for the wrong case.
             
             > multi minute load times
             
             23 GB / (100 MB/s) ≈ 235 s ≈ 3.9 min
             
             So in the worst case, when everything is loaded at once (how, on
             a system with < 32 GB RAM?), it takes about 4 minutes.
             
             Considering GTA whatever-version could sit for 15 minutes at the
             loading screen because nobody bothered to check why - the
             industry could really say not to bother.
       
            whywhywhywhy wrote 16 hours 6 min ago:
            Maybe it's changed a lot statistically in the last few years but
            for long time PC gamers used to have the mantra of small SSD for
            the OS and large HDD for games if they're price conscious so I
            could see that being assumed to be much more normal during
            development.
       
              Dylan16807 wrote 9 hours 19 min ago:
              It's a shameful tragedy of the commons if you bloat your game 6x
              because you think your customers don't have enough SSD space for
              their active games.
       
          esrauch wrote 17 hours 20 min ago:
          It's very easy to accidentally get misleading benchmarking results in
          100 different ways, I wouldn't assume they did no benchmarking when
          they did the duplication.
       
          maccard wrote 17 hours 32 min ago:
          I've been involved in decisions like this that seem stupid and
          obvious. There's a million different things that could/should be
           fixed, and unless you're monitoring this proactively you're
           unlikely to know it should be changed.
          
          I'm not an arrowhead employee, but my guess is at some point in the
          past, they benchmarked it, got a result, and went with it. And that's
          about all there is to it.
       
            seg_lol wrote 11 hours 11 min ago:
            Performance profiling should be built into the engine and turned on
            at all times. Then this telemetry could be streamed into a system
            that tracks it across all builds, down to a specific scene. It
            should be possible to click a link on the telemetry server and
            start the game at that exact point.
       
              maccard wrote 9 hours 47 min ago:
              How would that help them diagnose a code path that wasn't ever
              being run (loading non duplicated assets on HDDs)?
       
                seg_lol wrote 15 min ago:
                > diagnose a code path that wasn't ever being run
       
            alias_neo wrote 16 hours 37 min ago:
            They admitted to testing nothing, they just [googled it].
            
             To be fair, the massive install size was probably the least of
             the game's problems; its performance has been atrocious, and
             when they released for Xbox, the update that came with it broke
             the game entirely for me, and it was unplayable for a few weeks
             until they released another update.
            
            In their defense, they seem to have been listening to players and
            have been slowly but steadily improving things.
            
             Playing Helldivers 2 is a social thing for me: I get together
             online with some close friends and family a few times a month,
             and we play some Helldivers and have a chat. Aside from that
             period where I couldn't play because it was broken, it's been a
             pretty good experience playing it on Linux; even better since I
             switched from Nvidia to AMD just over a week ago.
            
            I'm glad they reduced the install size and saved me ~130GB, and I
            only had to download about another 20GB to do it.
       
            Xelbair wrote 17 hours 3 min ago:
            >These loading time projections were based on industry data -
            comparing the loading times between SSD and HDD users where data
            duplication was and was not used. In the worst cases, a 5x
            difference was reported between instances that used duplication and
            those that did not. We were being very conservative and doubled
            that projection again to account for unknown unknowns.
            
            >We now know that, contrary to most games, the majority of the
            loading time in HELLDIVERS 2 is due to level-generation rather than
            asset loading. This level generation happens in parallel with
            loading assets from the disk and so is the main determining factor
            of the loading time. We now know that this is true even for users
            with mechanical HDDs.
            
             they did absolutely zero benchmarking beforehand, just went with
             industry hearsay, and decided to double it just in case.
       
              FieryMechanic wrote 14 hours 46 min ago:
              They made a decision based on existing data. This isn't
              unreasonable as you are pretending, especially as PC hardware can
              be quite diverse.
              
               You'd be surprised what some people are playing games on. E.g.
               I know people who still use Windows 7 on an AMD Bulldozer rig.
               Atypical for sure, but not unheard of.
       
                red-iron-pine wrote 12 hours 14 min ago:
                i believe it.  hell i'm in F500 companies and virtually all of
                them had some legacy XP / Server 2000 / ancient Solaris box in
                there.
                
                old stuff is common, and doubly so for a lot of the world,
                which ain't rich and ain't rockin new hardware
       
                  FieryMechanic wrote 11 hours 31 min ago:
                  My PC now is 6 years old and I have no intention of upgrading
                  it soon. My laptop is like 8 years old and it is fine for
                  what I use it for. My monitors are like 10-12 years old (they
                  are early 4k monitors) and they are still good enough. I am
                  primarily using Linux now and the machine will probably last
                  me to 2030 if not longer.
                  
                   Pretending that this was an outrageous decision ignores
                   that the data and the commonly assumed wisdom at the time
                   said there were still a lot of people using HDDs.
                   
                   They've since rectified this particular issue, yet there
                   seems to be more criticism of the company after fixing it.
       
              pixl97 wrote 14 hours 55 min ago:
               >they did absolutely zero benchmarking beforehand, just went
               with industry hearsay [1]
               
               It was a real issue in the past with hard drives and small
               media assets. It's still a real issue even with SSDs: HDD/SSD
               IOPS are still way slower than contiguous reads when you're
               dealing with a massive number of files.
               
               At the end of the day it requires testing, which requires
               time, at a point when you don't have a lot of it.
              
 (HTM)        [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_f...
       
                the_af wrote 11 hours 32 min ago:
                 This is not a good invocation of Chesterton's Fence.
                
                The Fence is a parable about understanding something that
                already exists before asking to remove it. If you cannot
                explain why it exists, you shouldn't ask to remove it.
                
                In this case, it wasn't something that already existed in their
                game. It was something that they read, then followed (without
                truly understanding whether it applied to their game), and upon
                re-testing some time later, realized it wasn't needed and
                caused detrimental side-effects. So it's not Chesterton's
                Fence.
                
                You could argue they followed a videogame industry practice to
                make a new product, which is reasonable. They just didn't
                question or test their assumptions that they were within the
                parameters of said industry practice.
                
                I don't think it's a terrible sin, mind you. We all take
                shortcuts sometimes.
       
                imtringued wrote 13 hours 55 min ago:
                It's not an issue with asynchronous filesystem IO. Again, async
                file IO should be the default for game engines. It doesn't take
                a genius to gather a list of assets to load and then wait for
                the whole list to finish rather than blocking on every tiny
                file.
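
                   A minimal sketch of that "gather a list, wait on the
                   batch" pattern, using a thread pool so each read can block
                   independently (pool size and paths are illustrative, not
                   from any real engine):

```python
# Batched asset loading sketch: issue all reads concurrently and wait
# for the whole batch, instead of blocking on each tiny file in turn.
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def load_assets(paths: list[Path]) -> dict[Path, bytes]:
    """Read every asset in the list concurrently; return path -> bytes."""
    with ThreadPoolExecutor(max_workers=16) as pool:
        # map() preserves input order; the OS and drive are free to
        # reorder the underlying I/O to minimize seeking.
        return dict(zip(paths, pool.map(Path.read_bytes, paths)))
```

                   On an HDD this mainly gives the I/O scheduler a chance to
                   coalesce seeks; the bigger win discussed upthread comes
                   from packing small assets contiguously in the first place.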
       
                  pixl97 wrote 12 hours 19 min ago:
                  There are two different things when talking about application
                  behavior versus disk behavior.
                  
                  >wait for the whole list to finish rather than blocking on
                  every tiny file.
                  
                  And this is the point. I can make a test that shows exactly
                  what's going on here. Make a random file generator that
                  generates 100,000 4k files. Now, write them on hard drive
                  with other data and things going on at the same time. Now in
                  another run of the program have it generate 100,000 4k files
                  and put them in a zip.
                  
                  Now, read the set of 100k files from disk and at the same
                  time read the 100k files in a zip....
                  
                  One finishes in less than a second and one takes anywhere
                  from a few seconds to a few minutes depending on your disk
                  speeds.
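
                   The experiment described above can be sketched directly
                   (scaled down from 100,000 files so it runs quickly; an
                   uncompressed zip stands in for one contiguous read):

```python
# Scaled-down sketch of the test above: N tiny files read one by one
# vs. the same bytes read from a single uncompressed zip archive.
import os, time, tempfile, zipfile

N, SIZE = 1000, 4096          # scaled down from 100,000 x 4 KB
root = tempfile.mkdtemp()
payload = os.urandom(SIZE)

for i in range(N):            # write N individual files
    with open(os.path.join(root, f"f{i}"), "wb") as f:
        f.write(payload)

zpath = os.path.join(root, "all.zip")
with zipfile.ZipFile(zpath, "w", zipfile.ZIP_STORED) as z:
    for i in range(N):        # same data, one contiguous archive
        z.writestr(f"f{i}", payload)

t0 = time.perf_counter()
for i in range(N):            # many opens, many potential seeks
    with open(os.path.join(root, f"f{i}"), "rb") as f:
        data = f.read()
t_files = time.perf_counter() - t0

t0 = time.perf_counter()
with zipfile.ZipFile(zpath) as z:   # one open, sequential reads
    for name in z.namelist():
        data_z = z.read(name)
t_zip = time.perf_counter() - t0

print(f"individual files: {t_files:.3f}s, zip: {t_zip:.3f}s")
```

                   On a warm page cache the gap shrinks; the seconds-vs-
                   minutes difference described above assumes a cold cache on
                   a mechanical drive.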
       
              maccard wrote 16 hours 40 min ago:
              Nowhere in that does it say “we did zero benchmarking and just
              went with hearsay”. Basing things on industry data is solid -
              looking at the steam hardware surveys if a good way to figure out
              the variety of hardware used without commissioning your own
              reports. Tech choices are no different.
              
              Do you benchmark every single decision you make on every system
              on every project you work on? Do you check that redis operation
              is actually O(1) or do you rely on hearsay. Do you benchmark
              every single SQL query, every DTO, the overhead of the DI
              Framework, connection pooler, json serializer, log formatter? Do
              you ever rely on your own knowledge without verifying the
              assumptions? Of course you do - you’re human and we have to
              make some baseline assumptions, and sometimes they’re wrong.
       
              creshal wrote 16 hours 45 min ago:
              "Industry hearsay" in this case was probably Sony telling game
              devs how awesome the PS5's custom SSD was gonna be, and nobody
              bothered to check their claims.
       
                maccard wrote 16 hours 39 min ago:
                What are you talking about?
                
                This has nothing to do with consoles, and only affects PC
                builds of the game
       
                  creshal wrote 12 hours 52 min ago:
                  HD2 started as playstation exclusive, and was retargeted
                  mid-development for simultaneous release.
                  
                  So the PS5's SSD architecture was what developers were
                  familiar with when they tried to figure out what changes
                  would be needed to make the game work on PC.
       
                    Dylan16807 wrote 9 hours 22 min ago:
                    If what they were familiar with was a good SSD, then they
                    didn't need to do anything.  I don't see how anything Sony
                    said about their SSD would have affected things.
                    
                    Maybe you're saying the hearsay was Sony exaggerating how
                    bad hard drives are?  But they didn't really do that, and
                    the devs would already have experience with hard drives.
       
                      wtallis wrote 4 hours 39 min ago:
                      What Sony said about their SSD was that it enabled game
                      developers to not duplicate assets like they did for
                      rotating storage. One specific example I recall in Sony's
                      presentation was the assets for a mailbox used in a
                      Spider Man game, with hundreds of copies of that mailbox
                      duplicated on disk because the game divided Manhattan
                      into chunks and tried to have all the assets for each
                      chunk stored more or less contiguously.
                      
                      If the Helldivers devs were influenced by what Sony said,
                      they must have misinterpreted it and taken away an
                      extremely exaggerated impression of how much on-disk
                      duplication was being used for pre-SSD game development.
                      But Sony did actually say quite a bit of directly
                      relevant stuff on this particular matter when introducing
                      the PS5.
       
                        Dylan16807 wrote 3 hours 39 min ago:
                        Weird, since that's a benefit of any kind of SSD at
                        all.  The stuff their fancy implementation made
                        possible was per-frame loading, not just convenient
                        asset streaming.
                        
                        But uh if the devs didn't realize that, I blame them. 
                        It's their job to know basics like that.
       
                    maccard wrote 9 hours 45 min ago:
                    I don't really understand your point. You're making a very
                    definitive statement about how the PS5's SSD architecture
                     is responsible for this issue - when the issue is on a
                    totally different platform, where they have _already_
                    attempted (poorly, granted) to handle the different
                    architectures.
       
                mary-ext wrote 16 hours 40 min ago:
                the industry hearsay is about concern of HDD load times tho
       
        rawling wrote 18 hours 3 min ago:
         [1] 282 comments
        
 (HTM)  [1]: https://news.ycombinator.com/item?id=46134178
       
          HelloUsername wrote 16 hours 21 min ago:
          
          
 (HTM)    [1]: https://news.ycombinator.com/item?id=46123009
       
        tetris11 wrote 8 days ago:
        I wonder if a certain Amelie-clad repacker noticed the same reduction
        in their release of the same game.
       
          debugnik wrote 17 hours 46 min ago:
          Helldivers 2 is an always-online game, you won't find it cracked.
       
            NullPrefix wrote 17 hours 7 min ago:
            such games can only get private servers
       
              debugnik wrote 16 hours 53 min ago:
              Yes, but those are rarely a thing for most live service games.
              Unless someone is working on a reimplementation of the entire
              server side, there's no point in offering or downloading pirate
              copies.
       
                simplyinfinity wrote 4 hours 10 min ago:
                 There is a community - albeit a dwindling one - that
                 reimplements entire backends for MMO games. Look up the
                 ragezone forums. I
                grew up around Mu Online private servers. And I'm sure in time
                a private server for HD 2 will appear if arrowhead don't
                release one themselves :)
       
          squigz wrote 18 hours 17 min ago:
          Fitgirl and Anna (of Anna's Archive) are modern day heroes.
       
        Calzifer wrote 8 days ago:
         I was curious whether they optimized the download as well. Did it
         download the 'optimized' ~150 GB, wasting a lot of time there, or
         did it download the ~20 GB of unique data and duplicate it as part
         of the installation?
         
         I still don't know, but I did find an interesting reddit post where
         users found and analyzed this "waste of space" three months ago.
         [1] PS: just found it. According to this Steam discussion it does
         not download the duplicate data, and back then it only blew up to
         ~70 GB.
        
 (HTM)  [1]: https://www.reddit.com/r/Helldivers/comments/1mw3qcx/why_the_g...
 (HTM)  [2]: https://steamcommunity.com/app/553850/discussions/0/4372501943...
       
          maccard wrote 17 hours 24 min ago:
          Steam breaks your content into 1MB Chunks and compresses/dedupes them
          [0]
          
          [0]
          
 (HTM)    [1]: https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
       
          SergeAx wrote 8 days ago:
          They downloaded 43 GB instead of 152 GB, according to SteamDB: [1]
          Now it is 20 GB => 21 GB.
          
 (HTM)    [1]: https://steamdb.info/app/553850/depots/
       
        andrewstuart wrote 8 days ago:
        How is there so much duplication?
       
          crest wrote 17 hours 23 min ago:
          The idea is to duplicate assets so loading a "level" is just
          sequential reading from the file system. It's required on optical
          media and can be very useful on spinning disks too. On SSDs it's
          insane. The logic should've been the other way around. Do a speed
           test on start and offer to "optimise for spinning media" if the
          performance metrics look like it would help.
          
          If the game was ~20GB instead of ~150GB almost no player with the
          required CPU+GPU+RAM combination would be forced to put it on a HDD
          instead of a SSD.
       
            immibis wrote 14 hours 25 min ago:
            This idea of one continuous block per level dates back to the PS1
            days.
            
            Hard drives are much, much faster than optical media - on the order
            of 80 seeks per second and 300 MB/s sequential versus, like, 4
            seeks per second and 60 MB/s sequential (for DVD-ROM).
            
            You still want to load sequential blocks as much as possible, but
            you can afford to have a few. (Assuming a traditional engine
            design, no megatextures etc) you probably want to load each texture
            from a separate file, but you can certainly afford to load a block
            of grass textures, a block of snow textures, etc. Also throughput
            is 1000x higher than a PS1 (300 kB/s) so you can presumably afford
            to skip parts of your sequential runs.
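A toy model using the rough figures above makes the gap concrete (the fragment count and level size below are made-up numbers):

```python
def load_time(n_seeks: int, total_mb: float,
              seeks_per_sec: float, mb_per_sec: float) -> float:
    """Toy load-time model: total seek cost plus sequential transfer cost."""
    return n_seeks / seeks_per_sec + total_mb / mb_per_sec

# 600 MB of level data split into 50 fragments:
hdd = load_time(50, 600, seeks_per_sec=80, mb_per_sec=300)  # 0.6s seek + 2s read
dvd = load_time(50, 600, seeks_per_sec=4, mb_per_sec=60)    # 12.5s seek + 10s read
print(f"HDD: {hdd:.1f}s  DVD: {dvd:.1f}s")
```

On the HDD the seeks are noise; on the DVD they dominate, which is why the one-continuous-block layout mattered so much on optical media.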
       
          breve wrote 8 days ago:
          They duplicate files to reduce load times. Here's how Arrowhead Game
          Studios themselves tell it:
          
 (HTM)    [1]: https://www.arrowheadgamestudios.com/2025/10/helldivers-2-te...
       
            imtringued wrote 8 days ago:
            I don't think this is the real explanation. If they gave the
            filesystem a list of files to fetch in parallel (async file IO),
            the concept of "seek time" would become almost meaningless. This
            optimization will make fetching from both HDDs and SSDs faster.
            They would be going out of their way to make their product worse
            for no reason.
       
              jayd16 wrote 13 hours 21 min ago:
              The technique has the most impact on games running off physical
              disc.
              
              It's a well known technique but happened to not be useful for
              their use case.
       
              pixl97 wrote 14 hours 30 min ago:
              >If they gave the filesystem a list of files to fetch in parallel
              (async file IO)
              
              This does not work if you're doing tons of small IO and you want
              something fast.
              
               Let's say we're on an HDD with 200 IOPS and we need to read
               3000 small files scattered randomly across the drive.
               
               Well, at minimum this is going to take 15 seconds, plus any
               additional seek time.
               
               Now, let's say we zip those files up into a solid archive.
               You'll read it in half a second. The problem comes in when
               different levels all need a different 3000 files. Then you
               end up duplicating a bunch of stuff.
              
              Now, where this typically falls apart for modern game assets is
              they are getting very large which tends to negate seek times by a
              lot.
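Making that arithmetic explicit — the per-file size and sequential throughput below are assumed values, not measurements:

```python
IOPS = 200      # the hypothetical HDD above
N_FILES = 3000  # small files needed by one level
FILE_KB = 64    # assumed average asset size
SEQ_MB_S = 150  # assumed sustained sequential throughput

random_read_s = N_FILES / IOPS                     # 15.0s: ~one seek per file
archive_s = (N_FILES * FILE_KB) / 1024 / SEQ_MB_S  # ~1.2s: one sequential read
print(f"random: {random_read_s:.1f}s  solid archive: {archive_s:.1f}s")
```

Note the crossover the parent mentions: as individual assets grow past a few MB each, the seek count per byte drops and the solid-archive advantage shrinks.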
       
                imtringued wrote 13 hours 25 min ago:
                 I haven't found any asynchronous IOPS numbers for HDDs
                 anywhere. The IOPS figures on the internet are just 1000 ms
                 divided by the seek time, with an 8 ms seek time for moving
                 from the outer to the inner track, which is only really
                 relevant for the synchronous file IO case.
                
                For asynchronous IO you can just do inward/outward passes to
                amortize the seek time over multiple files.
                
                While it may not have been obvious, I have taken archiving or
                bundling of assets into a bigger file for granted. The obvious
                benefit is that the HDD knows that it should store game files
                continuously. This has nothing to do with file duplication
                though and is a somewhat irrelevant topic, because it costs
                nothing and only has benefits.
                
                The asynchronous file IO case for bundled files is even better,
                since you can just hand over the internal file offsets to the
                async file IO operations and get all the relevant data in
                parallel so your only constraint is deciding on an optimal
                lower bound for the block size, which is high for HDDs and low
                for SSDs.
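The idea of handing internal offsets to async IO can be sketched with Unix positioned reads; the manifest shape and thread-pool fan-out here are illustrative assumptions, not any engine's actual API:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def read_assets(bundle_path: str, manifest: dict, workers: int = 8) -> dict:
    """Fetch many assets from one bundle file via parallel positioned reads.

    `manifest` maps asset name -> (offset, length). os.pread (Unix-only)
    takes an explicit offset, so every worker can share a single file
    descriptor without racing on the file position.
    """
    fd = os.open(bundle_path, os.O_RDONLY)
    try:
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futs = {name: pool.submit(os.pread, fd, length, offset)
                    for name, (offset, length) in manifest.items()}
            return {name: f.result() for name, f in futs.items()}
    finally:
        os.close(fd)
```

With all requests in flight at once, the OS scheduler and drive command queue can reorder them, which is what amortizes seek cost across files.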
       
                  pixl97 wrote 12 hours 29 min ago:
                  >I haven't found any asynchronous IOPS numbers on HDDS
                  anywhere.
                  
                   As the other user stated, just look up CrystalDiskMark
                   results for both HDDs and SSDs and you'll see hard drives
                   do about 1/3rd of a MB/s on random file IO while the same
                   hard drive will do 400 MB/s on a contiguous read. For
                   things like
                  (again, you have to test this) orders of magnitude faster.
       
                  gruez wrote 13 hours 2 min ago:
                  >I haven't found any asynchronous IOPS numbers on HDDS
                  anywhere. The internet IOPs are just 1000ms/seek time with a
                  8ms seek time for moving from the outer to the inner track,
                  which is only really relevant for the synchronous file IO
                  case.
                  
                  >For asynchronous IO you can just do inward/outward passes to
                  amortize the seek time over multiple files.
                  
                  Here's a random blog post that has benchmarks for a 2015 HDD:
                  [1] It shows 1.5MB/s for random 4K performance with high
                  queue depth, which works out to just under 400 IOPS. 1 queue
                  depth (so synchronous) performance is around a third.
                  
 (HTM)            [1]: https://davemateer.com/2020/04/19/Disk-performance-C...
       
              extraduder_ire wrote 18 hours 1 min ago:
              "97% of the time: premature optimization is the root of all
              evil."
       
              toast0 wrote 18 hours 8 min ago:
              Solid state drives tend to respond well to parallel reads, so
              it's not so clear. If you're reading one at a time, sequential
              access is going to be better though.
              
              But for a mechanical drive, you'll get much better throughput on
              sequential reads than random reads, even with command queuing. I
              think earlier discussion showed it wasn't very effective in this
              case and taking 6x the space for a marginal benefit for the small
              % of users with mechanical drives isn't worth while...
       
                seg_lol wrote 11 hours 5 min ago:
                Every storage medium, including ram, benefits from sequential
                access. But it doesn't have to be super long sequential access,
                the seek time, or block open time just needs to be short
                relative to the next block read.
       
              Xss3 wrote 8 days ago:
               If they fill your hard drive you're less likely to install
               other games. If you see a huge install size you're less
               likely to uninstall with plans to reinstall later, because
               that'd take a long time.
       
                ukd1 wrote 17 hours 39 min ago:
                Unfortunately this actually is believable. SMH.
       
          jy14898 wrote 8 days ago:
          The post stated that it was believed duplication improved loading
          times on computers with HDDs rather than SSDs
       
            khannn wrote 10 hours 48 min ago:
            Who cares? I've installed every graphically intensive game on SSDs
            since the original OCZ Vertex was released.
       
              teamonkey wrote 9 hours 17 min ago:
              Their concern was that one person in a squad loading on HDD could
              slow down the level loading for all players in a squad, even if
              they used a SSD, so they used a very normal and time-tested
              optimisation technique to prevent that.
       
                khannn wrote 8 hours 55 min ago:
                Their technique makes it so that the normal person with a ~base
                SSD of 512 GB can't reasonably install the game. Heck of a job
                Brownie.
       
                  teamonkey wrote 8 hours 9 min ago:
                  Nonsense. I play it on a 512GB SSD and it’s fine.
       
            pjc50 wrote 18 hours 17 min ago:
            Key word is "believed". It doesn't sound like they actually
            benchmarked.
       
              wongogue wrote 18 hours 14 min ago:
                 There is nothing to believe. Random 4K reads on an HDD are slow.
       
                debugnik wrote 17 hours 9 min ago:
                I assume asset reads nowadays are much heavier than 4 kB
                 though, especially if assets meant to be loaded together are
                bundled together in one file. So games now should be spending
                less time seeking relative to their total read size. Combined
                with HDD caches and parallel reads, this practice of
                duplicating over 100 GBs across bundles is most likely a
                cargo-cult by now.
                
                Which makes me think: Has there been any advances in disk
                scheduling in the last decade?
       
            dontlaugh wrote 8 days ago:
            Which is true. It’s an old technique going back to CD games
            consoles, to avoid seeks.
       
              SergeAx wrote 8 days ago:
              Is it really possible to control file locations on HDD via
              Windows NTFS API?
       
                toast0 wrote 17 hours 57 min ago:
                Not really. But when you write a large file at once (like with
                an installer), you'll tend to get a good amount of sequential
                allocation (unless your free space is highly fragmented). If
                you load that large file sequentially, you benefit from drive
                read ahead and OS read ahead --- when the file is fragmented,
                the OS will issue speculative reads for the next fragment
                automatically and hide some of the latency.
                
                If you break it up into smaller files, those are likely to be
                allocated all over the disk; plus you'll have delays on reading
                because windows defender makes opening files slow. If you have
                a single large file that contains all resources, even if that
                file is mostly sequential, there will be sections that you
                don't need, and read ahead cache may work against you, as it
                will tend to read things you don't need.
       
                dontlaugh wrote 8 days ago:
                No, not at all. But by putting every asset a level (for
                example) needs in the same file, you can pretty much guarantee
                you can read it all sequentially without additional seeks.
                
                That does force you to duplicate some assets a lot. It's also
                more important the slower your seeks are. This technique is
                perfect for disc media, since it has a fixed physical size (so
                wasting space on it is irrelevant) and slow seeks.
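A toy sketch of that packaging scheme: each level gets its own contiguous blob with every asset it needs copied in, shared assets duplicated per bundle (the format here is illustrative, not any shipping engine's):

```python
def build_bundles(levels: dict, assets: dict) -> dict:
    """Pack each level's assets contiguously into one blob per level.

    `levels` maps level name -> list of asset names; `assets` maps asset
    name -> bytes. Shared assets are copied into every bundle that needs
    them, trading disk space for a purely sequential read at load time.
    """
    bundles = {}
    for level, needed in levels.items():
        blob, index, off = bytearray(), {}, 0
        for name in needed:
            data = assets[name]
            index[name] = (off, len(data))  # offset table, kept in a header
            blob += data
            off += len(data)
        bundles[level] = (bytes(blob), index)
    return bundles

# Two levels sharing one asset: the shared "ui" bytes get stored twice,
# 30 bytes on disk versus 25 bytes of unique data.
assets = {"grass": b"g" * 10, "snow": b"s" * 10, "ui": b"u" * 5}
packs = build_bundles({"lvl1": ["grass", "ui"], "lvl2": ["snow", "ui"]}, assets)
```

Scale the "ui" case up to textures and sounds shared across dozens of levels and you get the multi-gigabyte duplication the article describes.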
       
                  viraptor wrote 18 hours 7 min ago:
                  > by putting every asset a level (for example) needs in the
                  same file, you can pretty much guarantee you can read it all
                  sequentially
                  
                  I'd love to see it analysed. Specifically, the average number
                  of nonseq jumps vs overall size of the level. I'm sure you
                  could avoid jumps within megabytes. But if someone ever got
                  closer to filling up the disk in the past, the chances of
                  contiguous gigabytes are much lower. This paper effectively
                  says that if you have long files, there's almost guaranteed
                   gaps [1] so at that point, you may be better off
                   preallocating the individual files and eating the cost
                   of switching between them.
                  
 (HTM)            [1]: https://dfrws.org/wp-content/uploads/2021/01/2021_AP...
       
                    jayd16 wrote 13 hours 18 min ago:
                    It's an optimistic optimization so it doesn't really matter
                    if the large blobs get broken up.  The idea is that it's
                    still better than 100k small files.
       
                    justsomehnguy wrote 15 hours 23 min ago:
                    >  But if someone ever got closer to filling up the disk in
                    the past, the chances of contiguous gigabytes are much
                    lower
                    
                     Someone installing a 150GB game surely has 150GB+ of
                     free space, and there would be a lot of contiguous free
                     space.
       
                    wcoenen wrote 17 hours 24 min ago:
                    > But if someone ever got closer to filling up the disk in
                    the past, the chances of contiguous gigabytes are much
                    lower.
                    
                    By default, Windows automatically defragments filesystems
                    weekly if necessary. It can be configured in the
                    "defragment and optimize drives" dialog.
       
                      pixl97 wrote 12 hours 59 min ago:
                      Not 'full' de-fragmentation, Microsoft labs did a study
                      and after 64MB slabs of contiguous files you don't gain
                      much so they don't care about getting gigabytes fully
                      defragmented. [1] old article on the process
                      
 (HTM)                [1]: https://web.archive.org/web/20100529025623/http:...
       
                    toast0 wrote 17 hours 29 min ago:
                    From that paper, table 4, large files had an average # of
                    fragments around 100, but a median of 4 fragments. A
                    handful of fragments for a 1 GB level file is probably a
                    lot less seeking than reading 1 GB of data out of a 20 GB
                    aggregated asset database.
                    
                    But it also depends on how the assets are organized, you
                    can probably group the level specific assets into a
                    sequential section, and maybe shared assets could be
                    somewhat grouped so related assets are sequential.
       
                    dontlaugh wrote 18 hours 2 min ago:
                    Sure. I’ve seen people that do packaging for games
                    measure various techniques for hard disks typical of the
                    time, maybe a decade ago. It was definitely worth it then
                    to duplicate some assets to avoid seeks.
                    
                    Nowadays? No. Even those with hard disks will have lots
                    more RAM and thus disk cache. And you are even guaranteed
                    SSDs on consoles. I think in general no one tries this
                    technique anymore.
       
        habbekrats wrote 8 days ago:
         it seems wild, the state of games and development today... imagine:
         131GB out of 154GB of data was not needed....
       
          high_na_euv wrote 16 hours 47 min ago:
          It was needed. Just the trade off wasnt worth it.
       
            red-iron-pine wrote 10 hours 56 min ago:
            it wasn't needed -- need means "must have"
            
            they're a fantastically popular franchise with a ton of money...
            and did it without the optimizations.
            
            if they never did these optimizations they'd still have a hugely
            popular, industry leading game
            
            minor tweaks to weapon damage will do more to harm their bottom
            line compared to any backend optimization
       
            Tepix wrote 15 hours 42 min ago:
            I'd argue it was incompetence.
       
            Zambyte wrote 15 hours 59 min ago:
            It was wanted and intentionally selected, but it wasn't needed.
       
          maccard wrote 17 hours 28 min ago:
          This isn't unique to games, and it's not just "today". Go back a
          decade [0] find people making similar observations about one of the
          largest tech companies on the planet.
          
          [0]
          
 (HTM)    [1]: https://news.ycombinator.com/item?id=10066338
       
            lynnharry wrote 3 hours 51 min ago:
            > FB App is 114MB in size, but loading this page in Chrome will use
            a good 450MB, idk how they managed that.
            
             This reminds me of the old days, when I'd check what was using
             my PC's memory every now and then.
       
          wvbdmp wrote 8 days ago:
          The whole world took a wrong turn when we moved away from physical
          media.
       
            jayd16 wrote 13 hours 26 min ago:
            The (de)-optimization exists, essentially, because of physical
            media.
       
            tetris11 wrote 8 days ago:
            In terms of ownership, yes absolutely. In terms of read/write
             speeds to physical media, the switch to an SSD has been an unsung
            gamechanger.
            
            That being said, cartridges were fast. The move away from
            cartridges was a wrong turn
       
              crote wrote 15 hours 12 min ago:
              > That being said, cartridges were fast. The move away from
              cartridges was a wrong turn
              
              Cartridges were also crazy expensive. A N64 cartridge cost about
              $30 to manufacture with a capacity of 8MB, whereas a PS1 CD-ROM
              was closer to a $1 manufacturing cost, with a capacity of 700MB.
              That's $3.75/MB versus $0.0014/MB - over 2600x more expensive!
              
              Without optical media most games from the late 90s & 2000s
              would've been impossible to make - especially once it got to the
              DVD era.
       
              BizarroLand wrote 8 days ago:
              I hate it when you buy a physical game, insert the disk, and
              immediately have to download the game in order to play the game
              because the disk only contains a launcher and a key. Insanity of
              the worst kind.
       
                maccard wrote 14 hours 7 min ago:
                 The read speed of an 8x DVD is ~10MB/s. The cheapest 500GB
                 SSD on Amazon has a read speed of 500MB/s. An NVMe drive
                 does 2500MB/s. We can read an entire DVD's capacity (4.7GB)
                 from an SSD in under 10 seconds, compared to 8 minutes.
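Those rates translate directly into transfer times for a full single-layer DVD's worth of data:

```python
DVD_GB = 4.7  # single-layer DVD capacity

def transfer_seconds(mb_per_s: float) -> float:
    """Time to read a full DVD's worth of data at a given sequential rate."""
    return DVD_GB * 1000 / mb_per_s

for name, rate in [("8x DVD", 10), ("SATA SSD", 500), ("NVMe", 2500)]:
    print(f"{name}: {transfer_seconds(rate):.0f}s")
# roughly 470s (~8 min) vs ~9s vs ~2s
```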
       
                hbn wrote 16 hours 0 min ago:
                Nintendo is pretty good for putting a solid 1.0 version of
                their games on the cartridges on release. But on the other
                hand, the Switch cartridges use NAND memory which means if you
                aren't popping them into a system to refresh the charge every
                once in a while, your physical cartridge might not last as long
                as they keep the servers online so you could download a digital
                purchase.
                
                I've kinda given up on physical games at this point. I held on
                for a long time, but the experience is just so bad now. They
                use the cheapest, flimsiest, most fragile plastic in the cases.
                You don't get a nice instruction manual anymore. And honestly,
                keeping a micro SD card in your system that can hold a handful
                of games is more convenient than having to haul around a bunch
                of cartridges that can be lost.
                
                I take solace in knowing that if I do still have a working
                Switch in 20 years and lose access to games I bought a long
                time ago, hopefully the hackers/pirates will have a method for
                me to play them again.
       
                  Dylan16807 wrote 9 hours 12 min ago:
                  Are you sure those flashes are capable of refreshing?
       
                  wtallis wrote 15 hours 39 min ago:
                  > the Switch cartridges use NAND memory which means if you
                  aren't popping them into a system to refresh the charge every
                  once in a while, your physical cartridge might not last as
                  long
                  
                  You've been paying attention to the wrong sources for
                  information about NAND flash. A new Switch cartridge will
                  have many years of reliable data retention, even just sitting
                  on a shelf. Data retention only starts to become a concern
                  for SSDs that have used up most of their write endurance; a
                  Switch cartridge is mostly treated as ROM and only written to
                  once.
       
                    hbn wrote 15 hours 35 min ago:
                    What's "many years"?
                    
                    I've read about people's 3DS cartridges already failing
                    just sitting on a shelf.
       
                      hbn wrote 2 hours 11 min ago:
                      Speak of the devil, I just got this tweet in my feed
                      today
                      
 (HTM)                [1]: https://x.com/marcdwyz/status/199922672332226152...
       
                crest wrote 17 hours 26 min ago:
                Or the launch day patch is >80% the size of the game, but I
                don't want to go back to game design limited by optical media
                access speeds.
       
            breve wrote 8 days ago:
             Hard drives and optical discs are the reason they duplicated
             the data. They duplicated the data to reduce load times.
       
              habbekrats wrote 8 days ago:
              do they even sell disc of these game?...
       
                jsheard wrote 16 hours 52 min ago:
                They do, but it's irrelevant to performance nowadays since
                you're required to install all of the disc data to the SSD
                before you can play. The PS3/360 generation was the last time
                you could play games directly from a disc (and even then some
                games had an install process).
       
       
 (DIR) <- back to front page