[HN Gopher] Google can't pass its own page speed test
___________________________________________________________________
Google can't pass its own page speed test
Author : EvgeniyZh
Score : 341 points
Date : 2021-06-05 08:45 UTC (14 hours ago)
(HTM) web link (www.reddit.com)
(TXT) w3m dump (www.reddit.com)
| fauigerzigerk wrote:
| _> Cumulative layout shift is supposed to measure whether your
| site shifts around while it's loading, but it's so strict that
| even Google Translate fails the test._
|
| And so it should be. There's hardly anything worse than having
| things shift around after you start reading, scrolling or
| clicking on something.
|
| I think it's a very good sign if Google's own sites fail these
| tests. It means the tests are meaningful.
| swiley wrote:
| Google's pages are some of the worst for this.
|
| It's not hard to build pages without these problems, but
| everyone is so caught up in fancy web design that they ignore
| UX.
| atkbrah wrote:
| I think Windows Settings (the modern Settings application)
| sometimes fails at this too.
| adkadskhj wrote:
| > And so it should be. There's hardly anything worse than
| having things shift around after you start reading, scrolling
| or clicking on something.
|
| Oh my god, this is one of my largest complaints. I'm an
| impatient person so i often go to click something before the
| page is done - bad me - and so often JS loads and swaps
| buttons.
|
| Honestly though i've had that complaint about my OSes, too.
| The number of times i'm typing and something pops up and steals
| focus, or i go to click something and it shifts moments before,
| causing me to click something else.. ugh.
| kmeisthax wrote:
| If I were a native widget toolkit designer or web developer
| I'd implement a rule that all layout changes must be rendered
| and on the screen 150ms before input handling can use them.
| Lag compensation for local user input. If you click a button
| at the same time it moves, you should still click what was on
| the screen at the time you pressed the button.
|
| Yes, I know this effectively means having to keep two copies
| of the widget tree (one for input and one for output) and
| applying all changes as diffs. Don't care. I'll buy more RAM.
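|
| (A sketch of that lag-compensation idea in TypeScript; every
| name here is hypothetical, not any real toolkit's API. Clicks
| resolve against the newest layout that has already been on
| screen for 150ms, not against the layout of the current
| frame.)
|
|   // Hit-test input against a layout snapshot that is at least
|   // 150ms old, so a button that just moved can't steal a click.
|   interface Box { id: string; x: number; y: number; w: number; h: number; }
|
|   const SETTLE_MS = 150;
|   let snapshots: { t: number; boxes: Box[] }[] = [];
|
|   // Called whenever a new layout is committed to the screen.
|   function onLayoutCommitted(boxes: Box[]): void {
|     const now = performance.now();
|     snapshots.push({ t: now, boxes });
|     // Drop snapshots too old to matter for compensation.
|     snapshots = snapshots.filter(s => now - s.t <= SETTLE_MS * 2);
|   }
|
|   // Resolve a click at (px, py) against the stable layout.
|   function hitTest(px: number, py: number): Box | undefined {
|     const cutoff = performance.now() - SETTLE_MS;
|     const stable = [...snapshots].reverse().find(s => s.t <= cutoff)
|       ?? snapshots[0];
|     return stable?.boxes.find(b =>
|       px >= b.x && px < b.x + b.w && py >= b.y && py < b.y + b.h);
|   }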
| reaperducer wrote:
| When I switched to Mac, that was one of the most refreshing
| things about OSX: apps weren't allowed to steal focus.
|
| Sadly, that's no longer true.
| ancarda wrote:
| Is that a very recent change? I don't know that I've ever
| noticed that on macOS, but I started using this OS around
| Lion, I think
| Alex3917 wrote:
| > There's hardly anything worse than having things shift around
| after you start reading, scrolling or clicking on something.
|
| Most sites that fail for CLS don't have any CLS that's visible
| to the user.
| dazc wrote:
| I don't think anyone would disagree but, as things stand, you
| can have a page where things DO NOT visibly 'shift around' but
| still fail the test.
|
| I feel sorry for all the developers who are going to be
| inundated with complaints because their clients' sites have
| failed yet another benchmark that's perceived as important.
| epistasis wrote:
| Visible to whom?
|
| Web developers often have no clue how their page appears to
| others, because they are developing locally, and the
| production site is close and has few network issues.
|
| So a lot of developers that think everything is fine end up
| having no clue what most people experience.
|
| When web fonts first started seeing widespread use, I would
| very frequently see 2-10 second delays when all content had
| loaded except for the font, including massive images, but I
| couldn't see any text. When I complained about this trend in
| the web to web developers I knew, almost none of them even
| seemed to believe it was possible, or they assumed I was just a
| very unlucky user.
| rsynnott wrote:
| On the other hand, it means that Google has stopped caring
| about UI performance. Of course, anyone who's used Google
| Groups lately could tell you that.
| trasz wrote:
| The priority is ad display performance; search and news are
| just an add-on to the actual business.
| Tempest1981 wrote:
| My theory is that they test this on their 10 Gbit company
| network, and high-end laptops, and it works great. So it's
| hard to convince anyone that it's a problem.
| lrem wrote:
| In the office we are strongly encouraged to try our stuff
| with the WiFi network degraded to perform like EDGE
| (apparently nobody cares about GPRS any more). Obviously
| that hasn't been exercised much since last spring.
| refulgentis wrote:
| I'm not sure it's reasonable to jump from "Google Translate
| has a jump" to "Google has stopped caring about UI
| performance".
| rsynnott wrote:
| As I say, compare Google Groups today to Google Groups 10
| years ago. Or gmail. Or Calendar. Though Groups is possibly
| the best example, as it went, more or less overnight, from
| one of Google's snappiest properties to about its most
| sluggish.
|
| Google used to be very concerned about UI performance, but
| they seem to have totally lost interest in it.
| sim_card_map wrote:
| or Gmail for the last 10 years
| aparsons wrote:
| YouTube does this notorious thing where the ad above the first
| search result appears after the results, so you accidentally
| click the ad.
| pclmulqdq wrote:
| It's almost as if they get paid per click on that ad...
|
| There's no incentive for them to fix this, unfortunately.
| Everywhere else on YT there's a nice gray box that keeps the
| place of a UI element until that element loads.
| zelphirkalt wrote:
| I feel like there should be additional constraints. For
| example, a page that looks as simple as the Google search page
| should have a limit on how much code it needs to get there.
| Basically a bloat alarm, lighting up red once bloat is
| detected. And it would be redder than red for Google itself and
| things like YouTube.
| daed wrote:
| Gmail shifts almost every time I click on the Promotions tab. I
| go to click my top unread email and instead end up clicking an
| ad.
| tyingq wrote:
| _" I think it's a very good sign if Google's own sites fail
| these tests. It means the tests are meaningful."_
|
| I agree, that appears to be a good sign. I am curious to what
| degree Google will reward fast sites and punish slow ones. I
| wonder if they are willing to make big shifts in the rankings.
| hateful wrote:
| At my second job, around 2004, my boss was very strict about
| not having the page move on load; it was a big deal. But over
| the years, especially recently, I've noticed so many sites and
| apps that shift constantly, and it seems it's not a priority
| anymore.
|
| Just yesterday I opened the Amazon app, and the "Orders"
| button shifted before I could click it!
| epistasis wrote:
| Web development has gone a long way backwards over the
| years. The tech is so complicated that developers focus on
| their own productivity over basic user experience. It feels
| like it's been a long long long time since I've even seen a
| web article focus on good UI, instead it's all about how to
| handle all the layers of complexity or new tooling.
| bryanrasmussen wrote:
| is your old boss still around - how is his site?
| cecilpl2 wrote:
| Google _Search_ fails the cumulative layout shift test.
|
| I can't count the number of times the "People also search for"
| box pops up in the middle of my search results 3-4 seconds
| after load. It's just enough time for me to identify the result
| I want, move the mouse there, and then proceed to click on the
| new, different link that just loaded under my mouse.
|
| It's infuriating.
| diveanon wrote:
| At this point you have to assume that is by design.
| rightbyte wrote:
| Ye. It is like those ads in Android apps that happen to
| load where the button you try to click was.
|
| It is obvious someone at Google is trying to game some
| metric with accidental clicks.
| tomcooks wrote:
| I don't understand the downvotes you are getting,
| considering how they used such tricks whenever possible
| maccard wrote:
| It's more likely the people who develop and test and
| prioritize the features live in mountain view, with 5g and
| gigabit fiber with ultra low latency to the data centers.
| dylan604 wrote:
| That's why the dev tools in browsers have an option to
| simulate slower data speeds to test this kind of thing.
| Having good connectivity and failing to provide a useable
| experience for slower connections is just bad dev work.
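|
| (For what it's worth, the same throttling can be scripted in
| automated tests. A minimal sketch using Puppeteer and the
| Chrome DevTools Protocol; the CDP methods are standard, the
| numbers are just plausible slow-3G-ish values.)
|
|   import puppeteer from 'puppeteer';
|
|   const browser = await puppeteer.launch();
|   const page = await browser.newPage();
|   const cdp = await page.target().createCDPSession();
|
|   // Throttle the network roughly like DevTools' slow presets.
|   await cdp.send('Network.emulateNetworkConditions', {
|     offline: false,
|     latency: 150,                                // added RTT in ms
|     downloadThroughput: (1.5 * 1024 * 1024) / 8, // ~1.5 Mbps
|     uploadThroughput: (750 * 1024) / 8,          // ~750 Kbps
|   });
|   // Slow the CPU down 4x, like Lighthouse's mobile profile.
|   await cdp.send('Emulation.setCPUThrottlingRate', { rate: 4 });
|
|   await page.goto('https://example.com');
|   await browser.close();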
| catlifeonmars wrote:
| Or, it's a symptom of the ownership of the search
| experience being smeared out through a large
| organization.
| prollings wrote:
| Sure, but these people must have left that golden zone at
| some point and had their clicks stolen like the rest of
| us.
| Blikkentrekker wrote:
| > _3-4 seconds after load. It 's just enough time for me to
| identify the result I want, move the mouse there, and then
| proceed click on the new, different link that just loaded
| under my mouse._
|
| You have in duplex in one small fragment enunciated
| wonderfully why the rodent is never to be used for anything
| but video games and image editing.
| tomerv wrote:
| It took me several tries to understand this comment. In the
| first pass it looks like the result of some kind of Markov
| chain text generator.
| grey-area wrote:
| But A/B testing shows people love those links with that
| behaviour, they follow those links 73.523% of the time!
| Griffinsauce wrote:
| You kid but I've seen product teams put interaction metrics
| like this as OKRs and _ruin_ their product.
|
| "Data driven" development so often forgets about common
| fucking sense.
| marcosdumay wrote:
| It's not common sense; contesting your priors and
| verifying your assumptions is one of the hardest and most
| important parts of doing data-driven science.
|
| It's also not surprising that when you take a set of
| random people without science training, they'll just
| cargo-cult the most visible parts and forget about the
| hidden, essential ones. It should also not surprise
| anybody what part they forget about, since the cargo-cult
| speech is literally about this exact problem, but with
| trained scientists (did I say it was hard?) instead of
| random people.
| phamilton wrote:
| A fun story from a friend at Spotify: one metric they
| tracked was the time between sign up and first song play.
| Then one team decided to automatically start playing a
| song after sign up. They knocked that OKR out of the
| park!
| karmakaze wrote:
| Did these tests rule out that people just click on links on
| that area of the page?
| anonymousab wrote:
| The actual motivation or causation doesn't matter. All
| that matters is that some team or product lead can
| justify a decision or a promotion, or even just an
| ideology, using the data in some way.
| lstamour wrote:
| That's the joke.
| frosted-flakes wrote:
| The YouTube app for Android even does this. I have slow
| Internet, so I always read the description while the video
| loads. But often when I go to click the description box to
| expand it, a banner ad loads in its place and pushes the
| description down.
| janci wrote:
| I'm starting to think this is on purpose. Sometimes I get
| tricked twice: after pressing back and immediately going to
| click the correct link, this little bastard materializes under
| the cursor.
| roblabla wrote:
| Yeah, I have a custom rule in ublock origin to remove it.
| It's literally the only custom rule I have, but it happened
| to me _so damn often_ that it ended up being worth the time
| to identify the element so I could permanently block it.
|
| In case anyone ever needs it:
| google.com##div[id^="eob_"]
| Traubenfuchs wrote:
| That box is obviously and clearly malicious.
| catlifeonmars wrote:
| I don't know if I would go that far. Do you have evidence
| of this?
| oblio wrote:
| It's the most used webpage on the planet and for sure
| their 300 IQ engineers run into this daily.
|
| That entire page probably had tens of thousands of hours of
| UX work put into it.
|
| If it works like that, it's intentional.
| foobiter wrote:
| do you think they penalize themselves for it?
| [deleted]
| amanzi wrote:
| It infuriates me when Google search result pages jump around as
| I'm trying to click on links!
| baybal2 wrote:
| The cumulative amount of popups on Google websites, and apps has
| been unbearable for quite some time.
| tonetheman wrote:
| It would be nice if any of the web vitals were actually tied to
| conversion or bounce rates.
|
| I have used the web for a long time, and how long a page takes
| to load generally does not matter to me. Perhaps I am alone in
| that? No
| amount of javascript squishing will matter if the network from
| you to the origin site is slow to start with.
| awinter-py wrote:
| no website hoses a browser / laptop faster than the adwords
| dashboard
|
| I looked into it once -- it's some kind of shitty CPU bound loop
| in dart-to-js logic
|
| we're talking 10-second pauses on an older mac, full crashes on a
| linux laptop prone to OOM. This is for a site that does one or
| two basic crud actions. Few other websites do this and adwords is
| the most consistent offender, i.e. the worst.
| katzgrau wrote:
| Not totally surprising. A lot of things Google is officially
| opposed to as part of the Coalition for Better Ads are things
| you routinely see in Google Search and the Google Display
| Network.
|
| From the outside, Google looks more and more like an indifferent,
| multi-headed monster with competing agendas by the day.
| wilsonthewhale wrote:
| I tried a couple pages from SourceHut, which famously prides
| itself on its fast performance.
|
| The projects page (dynamic, changes with user activity on the
| site) https://sr.ht/projects: 100
|
| A patch page https://lists.sr.ht/~sircmpwn/sr.ht-
| dev/patches/23162: 99
|
| A source page
| https://git.sr.ht/~sircmpwn/pages.sr.ht/tree/master/item/ser...:
| 100
|
| SourceHut often "feels" a bit slower in that the browser does a
| full navigation whenever you click, but the absolute time is
| definitely low and I applaud them for building such a fast (and
| accessible!) experience.
| donohoe wrote:
| I don't take this as "Google doesn't care about UX" or
| performance as some comments suggest. Google is a large company
| and it's not one unified team working on various projects in
| exact sync.
|
| That said, as Google will start promoting non-AMP content that
| passes Core Web Vitals, it's become a bigger deal.
|
| I work in media and CLS is a big problem. Most publishers don't
| come close to passing. As of writing only 5 out of the 80 I track
| score above 80! (Out of 100, and higher is better)
|
| The publication I run Product/Engineering for hovers around 84 to
| 85 and we don't have ads.
|
| Full list: https://webperf.xyz/
|
| To save you a click, the top 5 are:
|
|   Rank  Site           Score  Speed-Index  FCP    LCP    Int    TBT
|   1     ThoughtCo      87     1856         1.6 s  2.0 s  6.5 s  345 ms
|   2     The Markup     86     3621         2.4 s  3.5 s  3.5 s  95 ms
|   3     Rest of World  84     3154         1.9 s  4.0 s  4.3 s  79 ms
|   4     Investopedia   81     2009         1.6 s  1.9 s  6.5 s  552 ms
|   5     The Spruce     80     1877         1.3 s  1.9 s  6.7 s  634 ms
|
| Bottom 5 are...
|
|   Rank  Site        Score  Speed-Index  FCP    LCP     Int     TBT
|   77    CNN         11     31408        6.2 s  35.7 s  70.5 s  17,666 ms
|   78    Seattle PI  10     27902        6.4 s  14.6 s  57.1 s  11,687 ms
|   79    SFGate      9      41064        7.4 s  31.1 s  96.7 s  24,437 ms
|   80    NY Mag      8      18222        6.0 s  10.8 s  41.1 s  7,157 ms
|   81    Teen Vogue  8      22549        3.4 s  9.1 s   42.3 s  8,968 ms
|
| If you want to help me get our Score to 95 or higher, I am hiring
| frontend developers :)
|
| https://news.ycombinator.com/item?id=27358113
| tedd4u wrote:
| Surprised there's anything lower on that list than SFgate :)
| donohoe wrote:
| It's funny you mention that; I worked on an SFGate experiment
| for optimizing ads and web performance in 2017.
|
| I got it down from an 85s load time to a 15s load time, and the
| Speed Index score from 21,000 to 7,000. Ad revenue went up by
| 35% too (better UX as well).
|
| Then some jackass on the business side signed a deal for an
| auto-play video player on every page and the project was
| killed.
|
| https://docs.google.com/presentation/d/12ds0b4nTxzcDy23te0Zm.
| ..
|
| It is a bit sad to see where it is now. The potential was
| there.
| walshemj wrote:
| Even Google has its own SEO teams, and they are not immune to
| the developers messing up.
|
| And you need technical SEOs, not more front-end devs cranking
| out the framework du jour.
| VHRanger wrote:
| restofworld.org: living up to their name and accessible in
| the rest of the world.
|
| Good on them
| donohoe wrote:
| That's very nice of you to say. We're trying our best but have
| more work to do.
| butz wrote:
| While improving performance is a good goal, chasing perfect
| scores in Google's tests sometimes leads to increased code
| complexity, bugs on older browsers, and even bigger website
| size.
| 6510 wrote:
| It's just not your computer anymore - in countless ways.
| Processing user input should come before anything else, second is
| displaying what the user wants. To state the obvious: You should
| be able to click around in a menu (that doesn't jump all over the
| place) and switch between content before it is fully loaded.
| Almost nothing does this. You have to use things like <a> and
| <form>, features from back when it was your computer.
| cush wrote:
| I've noticed that the Cumulative Layout Shift test reports
| failures when scrolling in Monaco Editor and other VList
| implementations. Monaco has buttery smooth scrolling even in
| giant files.
| SuchAnonMuchWow wrote:
| For once, I think this is a good initiative from Google.
|
| Most websites are over-bloated and this is a good incentive to
| move the web in the right direction.
| amelius wrote:
| Would you also think of it as a good initiative if the US
| government demanded the change?
|
| I don't know. To me it sounds a bit like "you can only be in
| the AppStore if your app meets these requirements".
|
| If Google was pure in its reasoning, they would allow slow
| pages to still be listed high in search results for people who
| don't care about page load times.
| threeseed wrote:
| > for people who don't care about page load times.
|
| And I am sure there is an incredibly high percentage of
| people who love nothing more than slow web sites.
| amelius wrote:
| If they are ad-free and tracking-free, then you can give me
| slow web sites.
| alserio wrote:
| For some strange reason ad-free and tracking-free sites
| are usually faster...
| [deleted]
| johnnypangs wrote:
| It seems like this is the case. There is a link to a more
| informative article in the Reddit post with a spokesperson
| from Google who says that if you're the most relevant, you'll
| still be ranked well.
|
| Article link: https://www.ntara.com/google-core-web-vitals/
| amelius wrote:
| Ok, let's extend this. If two pages are equally relevant
| and equally fast, will they show me the page with the
| fewest advertisements first?
|
| And why is speed more important than number of
| advertisements?
| tsimionescu wrote:
| Fortunately, speed is almost always inversely
| proportional to number of ads! So the two mostly go hand
| in hand, for sites with similar content.
| 0-_-0 wrote:
| I think the best approach would be to show a small indicator
| of page load times in the search results so everyone can
| decide for themselves what they prefer. Then these preferences
| would influence what is shown on top, creating an
| optimisation feedback cycle.
| EqScr9a3 wrote:
| There is an enormous difference between a demand from a
| government that would be backed up by the threat of violence
| and anything Google is doing. You are free to ignore Google
| without any risk of being shot or imprisoned.
| gostsamo wrote:
| Yep, the only possibility is that your business might be
| destroyed, you might be bankrupt, and your family needs to
| start again from zero.
| Evidlo wrote:
| Is there somewhere I can paste my website URL to test it with Web
| Vitals?
| topicseed wrote:
| Sure, go ahead at
| https://developers.google.com/speed/pagespeed/insights/
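|
| (Those are lab tests; the same metrics can also be collected
| from real users with Google's web-vitals library. A minimal
| sketch, assuming the v3+ API where the callbacks are named
| onCLS/onFID/onLCP; the /vitals endpoint is hypothetical.)
|
|   import { onCLS, onFID, onLCP } from 'web-vitals';
|
|   // Report each Core Web Vital as the browser finalizes it.
|   function report(metric: { name: string; value: number }): void {
|     navigator.sendBeacon?.('/vitals', JSON.stringify(metric));
|   }
|
|   onCLS(report);  // Cumulative Layout Shift
|   onFID(report);  // First Input Delay
|   onLCP(report);  // Largest Contentful Paint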
| [deleted]
| fouronnes3 wrote:
| https://web.dev/measure/
| lucgommans wrote:
| I wonder where they measure from, as it claims >1 second of
| loading time for what is ~500ms in my browser.
|
| And my browser is also absurdly slow, now that I look at it.
| The HTML at https://lucgommans.nl is 978 bytes, the CSS 2KB,
| and the ping 39ms. With DNS+TCP+TLS+HTTP, the HTML downloads
| in 159ms. And then it waits. And waits. Then, after 134ms, it
| decides that DOM content has been loaded and fires the
| DOMContentLoaded event (there is no JS on the page). And then
| it waits some more, until after waiting 177ms it decides to
| _start_ downloading the CSS which finishes in 44ms. And then
| it waits again for 40ms, downloads the 33KB background image
| in 80ms, and finally clocks the total loading time at 508ms.
|
| How fast should it be? The loading times are 283ms altogether
| (sequentially); how long does parsing 3KB of data take on a
| modern CPU? Close to zero right?
|
| I remember in iirc ~2007-2015, quite regularly there would be
| news that browser X had optimized the DOM parser, or
| rewritten the JavaScript engine, or cut off some time with
| this or that. What even is going on here, did all this go out
| the window with the multi-process browser where it has to
| talk to other processes to get things done, or what's up with
| all this waiting? Does anyone know?
| walshemj wrote:
| Developer Tools in Chrome is the main one; at scale, use
| Screaming Frog.
| Ekaros wrote:
| I really don't understand how poorly some websites and
| browsers seem to perform, even in environments where they have
| no right to have issues: sufficient or high amounts of
| everything from CPU, GPU and RAM to network...
| Vinnl wrote:
| I like Nolan Lawson's take [1] that this is a good thing, at
| least when it comes to the test itself: they're willing to push
| other teams at Google to improve (rather than lowering the bar
| for everyone else).
|
| I guess you could argue that they should've done that pushing
| internally, but that's really a concern only for Google itself.
| As long as it works, it's a win for users.
|
| [1] https://toot.cafe/@nolan/106358723424552836
| defanor wrote:
| Sad-yet-amusing fact: apparently most of the W3C member [0]
| websites can't pass the W3 HTML validator [1], for some years now
| (possibly it was the case for as long as both existed). With that
| in mind, failures to pass fairly complex tests for recently
| conceived things, which not everyone is even supposed to
| follow, are far from surprising.
|
| [0] https://www.w3.org/Consortium/membership.html
|
| [1] https://validator.w3.org/
| rasz wrote:
| I'm actually surprised how well YT comes out in this. The page
| is a dumpster fire:
|
| Before Polymer (the current YT framework) the YT video page
| weighed somewhere around 50KB (10KB compressed) and was
| ordinary HTML + a 1MB JS player (400KB compressed). As soon as
| the HTML part loaded, the page was all there.
|
| Now it's 600KB (100KB compressed) of JSON + an additional 8MB
| desktop_polymer.js (1MB compressed) that needs to be
| compiled/interpreted before it even starts building the page
| client-side and anything starts showing up. The 1MB JS player
| is on top of that.
| [deleted]
| kevingadd wrote:
| It's astonishing how much worse they made it - and
| intentionally made it worse in browsers without Web Components,
| like Firefox. Forcing the old non-Polymer website was like
| hitting the turbo button on an old PC.
| easrng wrote:
| It wasn't Web Components (Firefox supports those[0]) it was
| Shadow DOM v0, the original deprecated Chrome-only
| predecessor to Web Components. Except it has been removed
| from Chrome now so I don't think this is an issue anymore.
|
| [0]: https://caniuse.com/custom-elementsv1
| zelphirkalt wrote:
| "Do not trust any statistics you did not fake yourself."
| silvestrov wrote:
| YouTube even fakes FCP by initially showing a lot of dummy grey
| boxes until they have any idea of what to really show.
| gherkinnn wrote:
| These so-called skeleton screens are a common technique and
| not inherently bad.
|
| When you know an element's exact dimensions but not its
| contents, and have more important things to serve first, it's
| completely valid to use.
|
| It gets infuriating when these grey boxes animate about but
| then decide not to display anything anyway and just collapse.
| Or they load and load and load and don't account for network
| errors. Or when the box size has nothing to do with the size
| of the element it's being a placeholder for.
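|
| (The shift-free version of the technique is to give the
| placeholder the element's exact final dimensions and reuse the
| same box for the content. A sketch; the endpoint and sizes are
| hypothetical.)
|
|   // A skeleton that can't cause layout shift: the box keeps
|   // its final dimensions whether the content loads or fails.
|   function showSkeleton(parent: HTMLElement, w: number, h: number): HTMLElement {
|     const box = document.createElement('div');
|     box.style.width = `${w}px`;
|     box.style.height = `${h}px`;
|     box.style.background = '#e0e0e0';  // the grey placeholder
|     parent.appendChild(box);
|     return box;
|   }
|
|   async function loadCard(parent: HTMLElement): Promise<void> {
|     const box = showSkeleton(parent, 320, 180);
|     try {
|       const res = await fetch('/api/card');  // hypothetical endpoint
|       box.textContent = await res.text();    // same box, same size
|       box.style.background = 'none';
|     } catch {
|       box.textContent = 'Failed to load.';   // handle network errors
|     }
|   }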
| alpaca128 wrote:
| > These so-called skeleton screens are a common technique
| and not inherently bad.
|
| The technique may not be bad by itself, but it's so common
| mostly among super bloated website behemoths that every
| time I see those skeleton screens I automatically prepare
| myself for another 5 seconds of loading.
|
| It's the modern web equivalent of seeing an hourglass
| cursor on an underpowered Vista machine - not inherently
| bad but usually a bad omen.
| gherkinnn wrote:
| Can't argue with that.
|
| It's easier to randomly place grey boxes around a page
| than to address the real problems. Plus you get to
| lecture people with terms like "perceived performance".
| swiley wrote:
| Did we forget zombocom so quickly? A big part of the joke
| was that loading screens (of any kind) on sites are really
| stupid.
| tedd4u wrote:
| Link in case anyone doesn't know what this is referring
| to (I think)
|
| https://www.zombo.com/
|
| (Appears to have been upgraded to HTTPS and other
| modernities!)
| mrweasel wrote:
| It truly is a terrible site, in terms of performance.
| Streaming HD content used to be the "hard problem", but now
| it's displaying the site hosting the videos that's the issue.
|
| You can stream HD content on the lowest-end device, or on
| hardware almost 10 years old. The same hardware just isn't
| powerful enough to let you use the YouTube website in a
| performant way. I cannot fathom how YouTube doesn't see that
| as a problem.
| MattGaiser wrote:
| How many people, especially among the best advertising
| demographics, have 10 year old hardware? I have had 6
| computers in that timeframe and 4 phones.
| [deleted]
| jonpurdy wrote:
| Running a 2012 Mac Mini as an HTPC. It's been wonderful for
| anything H.264 and has no problems even playing 4K files
| (H.264; it chokes on H.265, of course).
|
| But YouTube is increasingly becoming unusable to the point
| where I just youtube-dl what I want to watch in advance.
| Philip-J-Fry wrote:
| I'm amazed at how much slower Youtube has gotten in the past
| couple of years. That fake paint stuff is terrible too.
|
| Here's something to try, resize youtube horizontally and watch
| your browser grind to a halt. At least in the case of Chrome
| for me.
| thejosh wrote:
| Well that would be why it's so much slower than it used to be!
| HN is one of the last bastions of fast sites :(
| SimianLogic2 wrote:
| I've been doing a lot of this work over the last month and LCP
| has been the hardest metric to move by far. I ended up dropping
| AdSense on mobile entirely from my site, which was ironically the
| biggest factor in performance degradation across a number of the
| stats they track.
| sa3dany wrote:
| Does it have to load early, though? If possible you can load
| it on demand after the page has fully loaded.
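|
| (One plausible way to do that: don't inject the ad script
| until the page has loaded and the browser is idle. A sketch;
| the loader URL is AdSense's public script, the timing values
| are arbitrary.)
|
|   // Load ads only after 'load', so they never compete with
|   // content for bandwidth or block the first paint.
|   function loadAds(): void {
|     const s = document.createElement('script');
|     s.src =
|       'https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js';
|     s.async = true;
|     document.head.appendChild(s);
|   }
|
|   window.addEventListener('load', () => {
|     if ('requestIdleCallback' in window) {
|       requestIdleCallback(() => loadAds(), { timeout: 5000 });
|     } else {
|       setTimeout(loadAds, 2000);  // fallback for older browsers
|     }
|   });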
| SimianLogic2 wrote:
| My site's not huge and mobile ads were only ~$150/mo. I might
| revisit it later, but for now I'm willing to take the path of
| least resistance.
| secondcoming wrote:
| As a daily user of Google's Cloud Platform web UI, this doesn't
| surprise me in the least!
| cube00 wrote:
| I was willing to cut some slack because I'm sure some
| operations actually take time on the back end (eg. create a VM)
| but even browsing read-only pages makes my laptop fan spin up
| like no other web site does.
| secondcoming wrote:
| I'm sure they do, but when doing something like removing an
| instance from an Instance Group you can never be 100% certain
| that it actually was removed without several refreshes.
|
| Switching to a different project can give you, for example, a
| list of instances belonging to the previous project.
|
| It's just not what I'd expect from Google considering what
| they pay engineers.
| alpaca128 wrote:
| > It's causing a fair amount of panic, because 96% of sites fail
| the test.
|
| Good. That sounds like a realistic estimation of the number of
| slow and bloated websites. What good are nice animations and
| designs when they destroy the UX?
|
| You'll never see someone gaming in 4k with hardware that can't
| render it with more than 15FPS. Yet we see that kind of
| "tradeoff" every time we browse the web. Users get loading times
| for the site itself, then processing of JS to redundantly do the
| browser's job of arranging the 20 layers of <div>s, then loading
| animations for the actual contents, and then a couple seconds
| after that you might get the first thumbnails.
|
| And I'm absolutely not surprised Google's pages fail this test
| as well; everything from Google Images to YouTube got
| increasingly worse with every new design iteration, with both
| slower loading times and an increase in breakage.
| eric__cartman wrote:
| The amount of bloat in modern websites always amazed me. I
| remember the first computer I ever had, a hand-me-down from my
| parents with 512MB of ram and a single core 1.6GHz cpu (yes I'm
| a zoomer and wasn't even born in the good ole days of dialup
| internet and Windows 95) and all websites I visited ran just
| fine. I could open many browser tabs and do all the things one
| normally does on a website. The only main difference maybe is
| that video playback nowadays is done at much higher resolutions
| and bitrate. And web apps were a very new (or maybe even non-
| existent) concept. But still, nowadays I see my web browser
| using 1GB+ of memory with a few tabs open containing some
| newspaper articles and perhaps a couple other misc non media
| heavy websites.
|
| This is madness. When not using an ad blocker, the amount of
| data that a regular website loads that's not relevant for what
| you need to read (i.e., excluding the text and images) is
| huge. I
| can understand why some complex web apps like Google Docs or
| whatever the cloud version of MS Office is called may be quite
| more resource intensive than a magazine article, but there is
| no reason why a newspaper or cooking recipe site should use
| memory in the hundreds of megabytes, when the useful content
| itself that the reader cares about is maybe (with images
| included) a couple megabytes in total.
| PaulDavisThe1st wrote:
| > When not using an ad blocker
|
| why would any sane person ever do this?
|
| oh right, mobile.
|
| so let me ask a different way: why would any sane person ever
| browse the web on any platform that does not have effective
| ad blocking?
| lupire wrote:
| Mobile.
| meowster wrote:
| Friends don't let friends use anything but Firefox with
| uBlock Origin, on Android... Sorry iPhone users :-/
| prophesi wrote:
| On iPhone, I use Wipr which handles blocking the majority
| of ads.
| artificial wrote:
| Also an option is NextDNS, Raspberry Pi without the
| hassle. A combination of 1Blocker and NextDNS works
| pretty decent.
| eric__cartman wrote:
| My mom's first response after I deployed AdGuard Home in
| her network was "I don't know what you did but my phone
| feels faster" lol
| throwaway3699 wrote:
| I don't mind ads, only have an issue with performance and
| tracking. My browsing habits are such that very
| occasionally I stumble upon a website full of ads and it's
| startling.
| dale_glass wrote:
| The memory requirements for graphics changed dramatically.
|
| A screen at 1024x768, 16 bit color is 1.5 MB.
|
| A screen at 3840x2160, 24 bit color is 24 MB. 32 MB if using
| 32 bit color.
|
| Add to it that graphics became more plentiful with increased
| bandwidth, and low color GIFs are out of fashion, and you
| very easily see the memory usage grow by several times just
| from that fact alone.
|
| Older operating systems also didn't have a compositor. They
| told the application: "this part of your window has just been
| damaged by the user dragging another window over it, redraw
| it".
|
| Modern operating systems use a compositor. Every window is
| rendered in memory then composed as needed. This makes for a
| much nicer experience, but the memory cost of that is quite
| significant.
|
| Take a webpage, and just give the mouse wheel a good spin. It
| should render lightning fast, which probably means the
| browser has a good chunk if not all of the page pre-rendered
| and ready to put on the screen in a few milliseconds. This
| also is going to take a lot of memory.
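|
| (The arithmetic above checks out; a throwaway sketch:)
|
|   // Uncompressed framebuffer size: width * height * bytes/pixel.
|   const mib = (w: number, h: number, bpp: number): number =>
|     (w * h * bpp) / (1024 * 1024);
|
|   console.log(mib(1024, 768, 2).toFixed(1));   // 16-bit: ~1.5 MiB
|   console.log(mib(3840, 2160, 3).toFixed(1));  // 24-bit: ~23.7 MiB
|   console.log(mib(3840, 2160, 4).toFixed(1));  // 32-bit: ~31.6 MiB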
| remram wrote:
| 32 MB is nothing. Unless you have hundreds of open windows
| this does not account for gigabytes of memory usage from
| the browser.
| dale_glass wrote:
| Point is, as resolutions and color depth increased, the
| amount of memory needed for graphics grew by several
| times. So a switch from 512MB being enough to several GB
| being needed is almost unavoidable on account of that
| alone.
| throwaway3699 wrote:
| That all happens on the GPU. Do task managers show memory
| for both?
| dale_glass wrote:
| What do you mean, "on the GPU"? Where do you think the
| GPU gets the textures?
|
| I'm not familiar with DirectX, but OpenGL manages
| textures invisibly behind the developer's back, to the
| point that it's difficult for a developer to find out how
| much VRAM there is. OpenGL wants to invisibly manage VRAM
| on its own, which means that every texture you have
| exists at least twice: Once in RAM, and once in VRAM. And
| very possibly 3 times, if the application keeps the
| original texture data in its own buffer for a bunch of
| reasons.
|
| So when you look at google.com, that Google logo probably
| exists in memory at least 3 times: probably as a RGB
| bitmap (the actual image object the browser works with),
| in RAM managed by the graphics driver (in whatever format
| the graphics card likes best), and then on the card's
| VRAM possibly. It could be more, like if the browser can
| apply some sort of color correction or other
| transformation and therefore keeps both the original and
| retouched version. The original PNG is also probably in
| the cache, and there exists the possibility of extra
| copies because some particular part of the system needs
| images to be in some specific format. Graphics are memory
| hungry, and it adds up fast.
|
| The nice thing about this is that your GUI doesn't get a
| whole bunch of horrible artifacts if you hook up a 4K
| monitor to a laptop that allocates 128MB VRAM to
| graphics. The 3D rendering layer simply makes do with
| what VRAM there is by constantly copying stuff from RAM
| as needed, with the applications not noticing anything.
|
| The bad thing is that this convenience has a cost in RAM.
| But really, for the better. Can you imagine the pain it
| would be to program a GUI if every single application had
| to allocate VRAM for itself and could break the system by
| exhausting it?
| john-doe wrote:
| An empty Google Docs document (read-only) is 6.5MB, making
| 187 requests...
| [deleted]
| jyounker wrote:
| > a hand me down from my parents with 512MB of ram and a
| single core 1.6GHz cpu
|
| Wow. I'm feeling old. My first computer ran at 1MHz with 16K
| of memory. :)
| dundarious wrote:
| A valid criticism is that Google's own products will presumably
| not be penalized in search results for these low scores.
|
| I'm happy to be wrong on that assumption, of course, but I think
| it's a reasonable one to make, and it severely dampens my
| willingness to agree that Google deserves praise for setting a
| high bar that it must itself struggle to reach in order to
| provide real value to users.
| shireboy wrote:
| I've struggled with this. Lighthouse is a great tool, but the
| things it dings me most on are: Google Ads, Google Fonts, and
| Google Maps.
| est wrote:
| Alternatively one department's product conflicts with another
| department's tool.
| nailer wrote:
| Direct link: https://www.ntara.com/google-core-web-vitals/
|
| Dang might want to fix it if he reads this.
| amelius wrote:
| This limit of 4 seconds, is that on some specific hardware? Or
| does Google make this relative to the user's hardware? (I.e., the
| user who is doing the search)
| SimianLogic2 wrote:
| The desktop performance targets are easy to hit, but mobile
| tests for Lighthouse are rough: "Simulated Fast 3G" network
| throttling and a 4x CPU slowdown.
|
| I don't know anyone in the US who is still on 3G, and modern
| mobile CPUs are not 4x slower than their desktop counterparts.
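|
| (Those knobs are settable if you run Lighthouse yourself; a
| sketch using the Node API, with throttling values matching the
| simulated mobile profile described above. Treat the exact
| option names as an assumption against your installed version.)
|
|   import lighthouse from 'lighthouse';
|   import * as chromeLauncher from 'chrome-launcher';
|
|   const chrome = await chromeLauncher.launch({
|     chromeFlags: ['--headless'],
|   });
|   const result = await lighthouse('https://example.com', {
|     port: chrome.port,
|     onlyCategories: ['performance'],
|     throttlingMethod: 'simulate',
|     throttling: {
|       rttMs: 150,                // simulated round-trip time
|       throughputKbps: 1638.4,    // ~1.6 Mbps down
|       cpuSlowdownMultiplier: 4,  // the 4x CPU slowdown
|     },
|   });
|   console.log(result?.lhr.categories.performance.score);
|   await chrome.kill();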
| di4na wrote:
| Guess what, Google does not sell mostly to the US!
|
| Also modern CPUs on mobile are faaaaaar more than 4x slower
| than desktop ones for the majority of the world.
| buu700 wrote:
| If we broaden "desktop" to include laptops, and assuming
| we're talking about common/mid-range consumer devices
| rather than e.g. spec'd out gaming machines, GP's point
| seems to hold up.
|
| It's still wild to me that the 2013 MacBook Pro that was my
| daily driver until recently is neck-and-neck on Geekbench
| with both my Pixel 5 (whose CPU is considered mid-range)
| and the old iPhone 7 that I use as a test device. It's
| decisively slower than every iPhone since version 8.
|
| If we move ahead to modern desktops: it looks like iPhones
| have typically been only 20 - 25% slower than iPad Pros
| released in the same years, and this year's iPad Pro
| literally has a desktop processor in it (not even just a
| laptop processor, now that the iMac uses M1 too).
|
| Based on that, in order for your claim to be true, the
| majority of the world outside the US would have to be using
| outdated or very low-end mobile devices and/or modern
| souped-up desktops that blow the average American's machine
| out of the water.
|
| Some googling shows that a popular phone in India is the
| Redmi 8, which is pretty low-end even compared to the Pixel
| 5, and scores about half the 2013 MBP at multi-core and
| slightly above 25% at single-core. If the average owner of
| a phone like this also happened to own a modern (not
| 8-year-old) mid-range consumer laptop, I could see 4x being
| overly optimistic.
| SimianLogic2 wrote:
| A) This level of vitriol directed at internet strangers is
| not super healthy. I hope you find something that helps you
| chill out a bit.
|
| B) I didn't say they did, but not every product needs to be
| concerned with the rest of the world. Websites in English
| targeted at a US market and charging US Dollars for their
| products probably don't care much about the average mobile
| processor speed in India (anecdotal source: me).
|
| I would guess that most SaaS businesses are primarily
| accessed on desktop computers during the work week, but I
| bet they're now collectively spending millions-if-not-
| billions of dollars in dev time to make their landing pages
| load faster on Indian mobile phones for users who are
| unlikely to ever visit or become customers.
|
| (I pick on India because my site gets a lot of Indian
| traffic, but feel free to swap in the developing nation of
| your choice.)
| Dylan16807 wrote:
| I wouldn't read "guess what" nearly as harsh as you're
| taking it.
| deeblering4 wrote:
| This is also the case if your site has DoubleClick ads or
| similar, they are the slowest part of the page by a significant
| amount.
|
| In my experience a site got a Lighthouse score of ~50 with
| DoubleClick ads enabled, and a score of 100 with my network ad
| blocker enabled.
|
| Truly infuriating that they penalize you for using their own
| solutions. And of course G has no support to speak of to help
| report or remedy the problem.
| uses wrote:
| One of the frustrating things about being a web developer is that
| I can do a ton of advanced optimization to the point of code
| splitting my js with dynamic import() or doing page level
| caching, lazyloading, and splitting non-critical CSS. But other
| departments of my org can squash it by adding a chat plugin with
| a disgusting level of resource abuse which obliterates all the
| advances I've made.
|
| My approach is to just explain what I can or can't do, and
| explain the trade-offs for everything. I'll give the decision-
| makers the info but I'm not going to be the curmudgeon going to
| war over something the marketing people say we need to survive.
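|
| (The code-splitting part can be as small as this sketch;
| module and element names are hypothetical. Ironically it's
| the same trick that would tame that chat plugin: load it on
| intent, not on page load.)
|
|   // Load a heavy, non-critical module only when the user asks
|   // for it, so it never blocks the initial render.
|   async function openChat(): Promise<void> {
|     const { mountChatWidget } = await import('./chat-widget.js');
|     mountChatWidget(document.getElementById('chat-root')!);
|   }
|
|   document.getElementById('chat-button')?.addEventListener(
|     'click', () => { void openChat(); }, { once: true });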
| kmeisthax wrote:
| DNS resolution 12ms
|
| First-byte time 64ms
|
| CSS & JavaScript 148ms
|
| Analytics 3,157ms
|
| Images 250ms
|
| someone who is good at the web please help me budget this. my
| store is slow
| cocoafleck wrote:
| Well clearly we can't remove analytics, and if we changed
| analytics to a different (faster) provider someone would have
| a cow... We could lower our image sizes, but let's be honest
| that would result in the dreaded JPGing. Our Javascript is a
| hand crafted wonder that we just rebuilt for the 6th time in
| the past decade in a new modern better framework so clearly
| that cannot be the problem. Therefore I'm leaning towards
| adding a CDN to lower the first byte time, or yelling at our
| DNS provider to speed up their side. /s
| offsky wrote:
| I wish Google would optimize AdSense for Core Web Vitals. My
| site gets a 100 score without ads and a score of 80 with
| AdSense. I'm not willing to give up the ad revenue.
___________________________________________________________________
(page generated 2021-06-05 23:01 UTC)