Post AzKViRI6rDHkvc8xKy by foone@digipres.club
 (DIR) Post #AzKUBHz0LNv8Y6pu4W by foone@digipres.club
       2025-10-18T09:32:51Z
       
       1 like, 0 repeats
       
       in a better world, sites would always clearly document how big file uploads can be, and that documentation would match reality.

       in this world, I'm currently bisecting a site's upload limit
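
       (the bisecting here amounts to a binary search over payload sizes; a rough Python sketch, with a placeholder URL and form field name, and assuming the server's accept/reject is visible in the HTTP status:)

       import requests

       URL = "https://example.com/upload"  # placeholder endpoint

       def upload_ok(size: int) -> bool:
           """Try uploading `size` junk bytes; treat HTTP 2xx as accepted."""
           blob = b"\0" * size
           resp = requests.post(URL, files={"file": ("probe.bin", blob)})
           return resp.ok

       def bisect_limit(lo: int, hi: int) -> int:
           """Largest size in [lo, hi] the server accepts (lo must work)."""
           while lo < hi:
               mid = (lo + hi + 1) // 2
               if upload_ok(mid):
                   lo = mid       # accepted: the limit is at least mid
               else:
                   hi = mid - 1   # rejected: the limit is below mid
           return lo

       # e.g. bisect_limit(3_000_000, 10_000_000) to probe between the
       # documented 3MB and 10MB figures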
       
 (DIR) Post #AzKUMYCcNdY82SZdCK by foone@digipres.club
       2025-10-18T09:34:55Z
       
       0 likes, 0 repeats
       
       the official sizes are 3MB, 5MB, and 10MB, for different types of files. in the real world, the actual size appears to be between 4.5MB and 5MB
       
 (DIR) Post #AzKUaTgZyDIM5IEFDE by foone@digipres.club
       2025-10-18T09:37:24Z
       
       0 likes, 0 repeats
       
       @barometz probably they are, yes. but I'm also intentionally trying to encode my file exactly how they encode files, to ensure maximum compatibility
       
 (DIR) Post #AzKViRI6rDHkvc8xKy by foone@digipres.club
       2025-10-18T09:50:02Z
       
       0 likes, 0 repeats
       
       @barometz I think you're right though
       
 (DIR) Post #AzKVtqaf8TDGZtGFua by foone@digipres.club
       2025-10-18T09:52:09Z
       
       0 likes, 0 repeats
       
       okay so I can now confirm they're recompressing them on the back end, but it's weirder than it sounds.

       because I can download the file I uploaded, and it got BIGGER?
       
 (DIR) Post #AzKW9QGWmJ5TZBGods by foone@digipres.club
       2025-10-18T09:54:58Z
       
       0 likes, 0 repeats
       
       okay I think the limit is 5,000,000 bytes
       
 (DIR) Post #AzKWCxuq7kSWncobNg by foone@digipres.club
       2025-10-18T09:55:35Z
       
       0 likes, 0 repeats
       
       so a little under 4.77 megabytes
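
       (for the conversion: 5,000,000 bytes / 1,048,576 bytes per MiB ≈ 4.768, hence "a little under 4.77")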
       
 (DIR) Post #AzKWGAJUUDpG7Ps5IG by green_bens@mastodon.green
       2025-10-18T09:55:47Z
       
       0 likes, 0 repeats
       
       @foone Sounds like they're doing something like:

       1. Decompress file
       2. Scan for viruses
       3. Recompress file (perhaps with faster processing and less compression)
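
       (that flow would explain the growth; a minimal Pillow sketch of the guessed pipeline, with the scan step as a stand-in. a re-encode at a fixed server-side quality can come out larger than a carefully tuned original, which matches the "it got BIGGER" observation:)

       from io import BytesIO
       from PIL import Image

       def process_upload(data: bytes) -> bytes:
           """Decompress -> scan -> recompress, per the guess above."""
           img = Image.open(BytesIO(data))
           img.load()  # fully decode (the "decompress" step)
           # ... a scan over the decoded pixels would go here ...
           out = BytesIO()
           # re-encode at a fixed quality; if the original was compressed
           # harder than this, the output comes out BIGGER than the input
           img.convert("RGB").save(out, format="JPEG", quality=90)
           return out.getvalue()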
       
 (DIR) Post #AzKXqb62U8MTZ1W18S by zeitkapsl@mastodon.social
       2025-10-18T10:12:22Z
       
       0 likes, 0 repeats
       
       @foone we stumbled upon a similar problem in our thumbnail generation as well: when resizing and compressing e.g. a 4032x3024 HEIC image down to a 1920px-long-edge WEBP, the resulting smaller image would be larger than the original on Safari. Turns out Safari completely ignores the quality parameter in the WEBP encoder, effectively turning off lossy compression entirely
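
       (the symptom is easy to test for in any encoder: compress the same image at two far-apart quality settings and check that the sizes actually differ. the Safari bug itself was in the canvas WEBP encoder; this Pillow sketch just reproduces the test server-side, with an arbitrary 0.8 ratio as the threshold:)

       from io import BytesIO
       from PIL import Image

       def quality_is_honored(img: Image.Image) -> bool:
           low, high = BytesIO(), BytesIO()
           img.save(low, format="WEBP", quality=10)
           img.save(high, format="WEBP", quality=95)
           # a working lossy encoder should make q=10 much smaller
           return low.tell() < high.tell() * 0.8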
       
 (DIR) Post #AzKXqc8YcFyCn7hXhQ by foone@digipres.club
       2025-10-18T10:13:55Z
       
       0 likes, 0 repeats
       
       @zeitkapsl oh, fun!
       
 (DIR) Post #AzKZ4KwrQ5IhURKznU by pointingdevice@social.restless.systems
       2025-10-18T10:27:33Z
       
       0 likes, 0 repeats
       
       @foone 4chan, at least, used to losslessly process JPEG files with jpegtran to remove all metadata, but without the -optimize parameter, so every file got re-encoded with the fixed baseline Huffman tables and became larger. If you played the fixed Huffman tables right, you could easily end up with a 300 MB file. But at some point, they changed it so that the size cap applied after this cleansing step.
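
       (the jpegtran difference is easy to reproduce; a small Python wrapper, assuming jpegtran is on PATH. -copy none strips metadata, and -optimize builds per-image Huffman tables instead of the fixed default ones that made files grow:)

       import os
       import subprocess

       def jpegtran_size(src: str, dst: str, optimize: bool) -> int:
           cmd = ["jpegtran", "-copy", "none"]
           if optimize:
               cmd.append("-optimize")  # per-image Huffman tables
           cmd += ["-outfile", dst, src]
           subprocess.run(cmd, check=True)
           return os.path.getsize(dst)

       # jpegtran_size("in.jpg", "fixed.jpg", False)  # fixed tables, bigger
       # jpegtran_size("in.jpg", "opt.jpg", True)     # optimized, smaller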