[HN Gopher] Servo, the parallel browser engine written in Rust
___________________________________________________________________
Servo, the parallel browser engine written in Rust
Author : manx
Score : 302 points
Date : 2023-05-27 07:50 UTC (15 hours ago)
(HTM) web link (servo.org)
(TXT) w3m dump (servo.org)
| smarkov wrote:
| Without downplaying all the effort being put into this, I think
| we're just digging a deeper hole.
|
| Websites do more or less the same thing they did 15 years ago but
| they are now 20 times more complex to develop and maintain. A
| good amount of developers would rather donate a kidney than write
| CSS. Accessibility is a hack. Performance is getting worse
| despite us having better than ever hardware. We're spending large
| amounts of time reinventing the wheel. JS is a horrible, bare
| bones language which is why you can't get anything done without
| 100 packages.
|
| It's time to move away from HTML/CSS/JS. They worked great for as
| long as they did but instead of further contributing to the mess
| that they've become, we should be looking into alternatives.
| franklampard wrote:
| > A good amount of developers would rather donate a kidney than
| write CSS. Accessibility is a hack.
|
| Name one :)
| satvikpendem wrote:
| People who write Flutter, for one, including me to some
| extent.
| ohgodplsno wrote:
 | Depends. What browsers do I have to support? The amount of
| kidneys I donate is directly proportional to how much I have
| to target IE6, at which point I donate both and would rather
| die.
| stjohnswarts wrote:
| HTML/CSS are fine, as is javascript, but yeah I don't
| understand why web pages get bigger and bigger and bigger and
 | slower. Just because we have more bandwidth doesn't mean we
 | have to chew through it with ever more complicated webpages. I
| get stuff like slack or gmail, but showing a company webpage?
| Why does that take 10 seconds to download and render?
| Tagbert wrote:
 | Ads and all of the frameworks that each ad network includes.
| LeFantome wrote:
| Because users want that? As a result, sites that deliver that
| deliver more value to their owners than sites that don't?
|
| I do not subscribe to the view that it is all just
| collectively worse because we are cooperatively moving things
| away from our preferences.
|
 | I use ancient hardware. I wish the web was much lighter. That
 | said, I do not see the fact that it (and everything in
 | computing) keeps becoming more and more resource intensive as
 | evidence of some kind of incompetence or collective failure.
|
| It just means I represent too little demand to dictate the
| supply.
| forgotmypw17 wrote:
| The beauty of the Web stack is that all of what you are
| describing is optional.
|
| You can write simple, accessible, performant websites, which
| use JS as a bonus and have all the basic features without. And
| as a bonus, it works across all the browsers, not just
| Chrome/Firefox. As a bonus, it works for all the accessibility
| scenarios, not just the standard ones in the test suites. And
| as a huge bonus, it's much easier to maintain and less fragile.
|
| I'm personally on the extreme end, what with trying to support
| not just today's browsers but also the retro mainstream such as
| Netscape and IE, but you don't have to go that far to have an
| enjoyable experience with this platform.
| lapinot wrote:
| > The beauty of the Web stack is that all of what you are
| describing is optional.
|
| The problem is not that you cannot write lightweight
| accessible and beautiful websites, you can. The problem is
| that this is now all part of the "default web browser" and so
| many people are taking advantage of its widespread
| availability to use it without a good reason to. Most
| commonly fingerprinting, tracking, following corporate dev
| fads, etc. And because of that it is now borderline
| impossible to have a decent experience using a lightweight
| browser because so many websites make crucial use of stuff
 | they should really not. It's possible, but it takes time and
 | know-how; personally i don't find it fun enough to invest
 | more time than crafting my ublock whitelists. And that is
 | already more than most people will do (even people who would
 | be able to).
| forgotmypw17 wrote:
| I don't have a strategy for fixing the entire ecosystem,
| but for my little corner of the world, I guess I've
| constructed a mental venn diagram where I only browse at
| the overlap of "content I'm interested in" and "sites which
| are accessible to me", and leave the rest of it alone.
| There is more than enough for me to browse in that space,
| especially combined with my own websites, that I rarely
| even think about the horrors you mention.
|
| One of the coolest things about it is that I have noticed
| over time that obnoxious frontend correlates very strongly
| with crappy content, so the average quality of what I read
| and watch has improved drastically. I try to practice a
| "mental diet", and it has helped tremendously with that.
|
| In some ways, it's not unlike IRL, where there are places I
| would rather not be, and they have certain tells, and I'm
| OK with them being there, I just don't go inside if I can
| help it, and I leave as quickly as possible if I do.
|
 | I think the Web is still very young, and now that we can
 | have ML-assisted markup generation, accessibility will
| be coming around, just like wheelchair ramps became the
| norm. Pretty soon, we won't be dismissing a 0.01% browser
| as not worth supporting, because it will be so much easier
| to just tell the server, "please remove JS and all but the
| minimal markup from your pages".
|
| If you want to see a PoC of what this may look like, here
| is a quick demo video. The NoJS bit is at about 2:30.
| https://vimeo.com/828698165
| capableweb wrote:
| > Without downplaying all the effort being put into this
|
| > Websites do more or less the same thing they did 15 years ago
|
 | That is a huge downplay if anything :) Websites certainly do
 | the same things they always have; the difference nowadays is
 | that there are web apps too, interactive applications, which
 | existed 15 years ago, sure, but not at the same scale as
 | today.
|
 | But most web apps today should just be websites; there are
 | genuine use cases for web apps too, although they are few.
|
| Once you start to understand there is a difference between
| websites and web apps, things start to make sense. There is
| still a huge misuse of making things into web apps, but at
| least you start to understand why the ecosystem moves in a
| direction you seemingly don't grasp.
|
| > It's time to move away from HTML/CSS/JS. They worked great
| for as long as they did but instead of further contributing to
| the mess that they've become, we should be looking into
| alternatives.
|
 | There are huge efforts toward this already, which you also
 | seem to have missed. The whole WASM effort is about being able
 | to write code for browsers in any language you want, and it's
 | already usable today. It's missing some vital things, like DOM
 | manipulation, that would make it 100x more useful, but again,
 | it's still useful today. Lots of games are written in Rust and
 | deployed as WASM, for example, runnable in the browser.
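A minimal sketch of what the JS-side glue for WASM looks like. The module bytes below are hand-assembled for self-containment; in practice they would come from a `.wasm` file produced by a compiler (e.g. Rust targeting wasm32). The exported function becomes an ordinary JS-callable value, which is also why DOM access still has to go through JS glue like this.

```javascript
// Hand-assembled minimal WebAssembly module exporting add(a, b) -> a + b.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,             // "\0asm" magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,       // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                     // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,       // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, // code: local.get 0, local.get 1,
  0x0b,                                                       //       i32.add, end
]);

// Synchronous API for brevity; WebAssembly.instantiate() is preferred for large modules.
const { add } = new WebAssembly.Instance(new WebAssembly.Module(bytes)).exports;
console.log(add(2, 3)); // 5
```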
| kome wrote:
| we should go back to hand written html and css, and php when
| needed. i'm not even ironic.
| twobitshifter wrote:
| Websites do the same thing as 15 years ago? Not true, websites
 | can now exceed desktop apps. Was there Figma 15 years ago?
|
| 20 times more complex - this comes with doing more. 15 years
| ago there was no CI/CD.
|
| Write CSS - don't have to given Sass, components, and
| bootstrap.
|
| Performance is worse - look at v8 benchmarks, look at webgl and
| Wasm. The browser itself can run much faster today, and people
| are doing more with it.
|
 | JS barebones - just one package:
 | https://github.com/stdlib-js/stdlib - and the node ecosystem
 | is a feature, not a bug.
|
| Look elsewhere - every other UI framework we've tried before
| has been worse in terms of compatibility, functionality,
| flexibility, and available prebuilt tooling.
| lapinot wrote:
 | Nothing you say is wrong per se. But what GP is saying, i
 | believe, is that most websites _should not need_ these
 | features to provide value to users. And indeed most websites
 | useful to me are still mildly interactive documents. The
 | problem is that web browsers, standards, and ecosystems
 | inflated to cater to the few webapps needing advanced control
 | over the machine. In a sense the web is just the new java: an
 | environment said to be "cross-platform" that is in fact just
 | a new platform whose vm became ubiquitous.
|
 | There is nothing wrong with the web-as-vm, but it has eaten
 | the web-as-interactive-documents. And now, to have a
 | lightweight web experience not focused on webapps, i am stuck
 | with the heavyweight runtimes with _even more stuff on top_
 | just to disable features, lock down invasive websites that
 | grabbed the newly available features to implement invasive
 | anti-features, etc.
| speed_spread wrote:
| In the late 90's we had native apps with UIs more complex
| than Figma that ran fine with 1/100 the CPU, RAM and storage
| we have now. The online rush buried a whole class of
 | development tools that had to be painfully reinvented over
 | the next 20 years and is still bogged down by the incidental
 | complexity of using the web as an application platform.
| ptx wrote:
| One example of this, I think, is that modern UIs are
| seemingly no longer able to display large lists. The native
| win32 API had virtualized list views 27 years ago, since
| Windows 95 OSR2 [1][2].
|
| [1] https://learn.microsoft.com/en-
| us/windows/win32/controls/lis...
|
| [2] https://www.geoffchappell.com/studies/windows/shell/com
| ctl32...
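The win32 technique (and what libraries like react-window reimplement on the web) boils down to rendering only the rows that intersect the viewport. A minimal sketch with names of my own choosing, for fixed-height rows:

```javascript
// Core of a virtualized list: given the scroll position, compute which
// fixed-height rows are visible (plus a small overscan buffer) and render
// only those, no matter how many rows the list logically holds.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 3) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.floor((scrollTop + viewportHeight - 1) / rowHeight) + overscan
  );
  return { first, last };
}

// A million-row list scrolled to y=30000 in a 600px viewport with 30px rows:
// only 26 row elements actually need to exist.
console.log(visibleRange(30000, 600, 30, 1_000_000)); // { first: 997, last: 1022 }
```

A real implementation would wire this to a scroll event and offset the rendered slice with a spacer or transform, but the arithmetic above is the whole trick.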
| aranchelk wrote:
 | IIRC they were:
 |
 | * Often Windows-only
 | * Infrequently updated, with bugs bad enough to crash the
 |   app, sometimes the system
 | * Dependent on the user to install new versions
 | * Able to access everything on your computer
 | * Non-collaborative
 | * Dependent on you for data backups
|
| These are non-trivial things.
|
| Now that I'm thinking about it, I still don't have
| convenient sandboxing of desktop apps.
| speed_spread wrote:
| We have online package managers, garbage collected
| languages, containers, unit testing, memory protection,
 | remote backups, cross-platform UI toolkits - none of the
 | issues you mention required serving everything through a
 | browser window set in motion by a subpar language, written
 | over a weekend, with cosmetic considerations as its first
 | design principle.
|
| BTW you can have easy app sandboxing today using flatpak,
| works like a charm on Fedora.
| ptx wrote:
| Is the Flatpak sandboxing actually secure, though? Or
| does it work like a charm because most of the security
| enforcement is disabled in practice?
|
 | Allegedly [1] a lot of popular packages use
 | "--filesystem=host", which completely defeats the security
 | of the sandbox by granting access to the user's home
 | directory (i.e., it allows arbitrary code execution through
 | modification of configuration files).
|
| I think I would rather trust the browser's sandbox, where
| sandboxing has been in place from the start and
| applications are designed for it.
|
| [1] https://flatkill.org/2020/
| satvikpendem wrote:
| Indeed, this is one reason why Ian Hickson (who helped create
| the HTML5 spec) is advocating for moving towards an application
| platform based on WASM instead. He has some good comments on
| this thread.
|
| https://news.ycombinator.com/item?id=34612696
| jupp0r wrote:
| It always boggles my mind how people on HN keep claiming that
| websites are slow because of JS when every 10th headline is
| about raytracing in WebGPU at 60fps or running LLM inference
| locally in the browser or similar.
|
 | Let me open your mind up to the possibility that no language
 | can prevent slow applications from being written in it, and
 | that the vast majority of web apps don't treat performance as
 | a primary concern as long as it's good enough for their
 | particular definition of "good enough".
 |
 | If you are so motivated, creating highly performant web apps
 | is easy. The tooling for making improvements is among the
 | best of any programming language.
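One small, concrete piece of that tooling is the built-in high-resolution timer, available as a global in both browsers and recent Node. A trivial sketch (the `timeIt` helper name is my own):

```javascript
// performance.now() gives sub-millisecond timing without any packages.
function timeIt(label, fn) {
  const start = performance.now();
  const result = fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(2)} ms`);
  return result;
}

// Example: time a hot loop before and after an optimization attempt.
const sum = timeIt("sum 1e6", () => {
  let s = 0;
  for (let i = 0; i < 1_000_000; i++) s += i;
  return s;
});
```

For anything serious the browser devtools profiler and flame charts take over from here, but micro-measurements like this are often enough to locate a slow path.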
| zelphirkalt wrote:
| While I agree with websites being 20 times more complex, I
| don't agree with:
|
| > A good amount of developers would rather donate a kidney than
| write CSS.
|
| If I was doing web development, I would most definitely use
| modern CSS to make responsive websites. I would prefer it and
| use it in almost all cases over using JS.
|
| > Accessibility is a hack.
|
| If one uses HTML semantically, and does not resort to hacks, it
| actually offers a lot of accessibility. More than most "modern"
| JS-only websites.
|
| > Performance is getting worse despite us having better than
| ever hardware. We're spending large amounts of time reinventing
| the wheel. JS is a horrible, bare bones language which is why
| you can't get anything done without 100 packages.
|
| Yes. Most of that performance loss is due to JS bloat (need to
| download that framework first), ads, and unwanted tracking.
|
| JS itself has actually improved, at least the APIs for events,
| DOM access, AJAX, and probably more. What has not improved is
| the mindset and development practices of developers and
| companies. Probably 90% of the websites using big frameworks
| would not need any of it and could live on server side rendered
| templates like we had more than a decade ago already. Sprinkle
| in some interactive components only on pages that need them and
| most of a website's pages would not be affected by that at all.
| Done.
|
| Sometimes we should take a step back and really ask ourselves
| what kind of website we are building. What is the character of
| that website? Is it merely a website that shows some
| information about a company? It can probably live on server
| side rendered templates. A blog? Same. Not every website needs
| to be a "web app". Most of them actually do not.
|
| > It's time to move away from HTML/CSS/JS. They worked great
| for as long as they did but instead of further contributing to
| the mess that they've become, we should be looking into
| alternatives.
|
| I don't agree. It is time to embrace standard and good usage of
| HTML and CSS and eschew heavy JS frameworks and JS on the
| server as much as possible. I would not put HTML/CSS and JS
| into one basket here to throw out things. HTML and CSS have
| made great progress. JS too, but as I said, the mindset and
| practices of the ecosystem and people are not there.
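The point that JS itself has improved is easy to make concrete. A few now-native features (version notes approximate) that once required helper packages:

```javascript
// Tasks that once needed lodash/jQuery-style helpers, now built in.
const users = [
  { name: "Ada", roles: ["admin"] },
  { name: "Grace" }, // note: no roles property at all
];

// Optional chaining + nullish coalescing replace defensive access helpers.
const role = users[1].roles?.[0] ?? "guest";

// Object.fromEntries + array methods replace _.keyBy / _.filter.
const byName = Object.fromEntries(users.map(u => [u.name, u]));
const admins = users.filter(u => u.roles?.includes("admin"));

// structuredClone (Node 17+, all modern browsers) replaces deep-copy packages.
const copy = structuredClone(users);

console.log(role, admins.length, copy[0] !== users[0]); // guest 1 true
```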
| HollowEyes wrote:
 | Agreed, a load of old crap that could for the most part have
 | been achieved with a better implementation of html frames.
 | Same sucky problems: publishing barriers, devs holding sites
 | to ransom. Etc. Etc.
|
| A good open source app for a shop, a church, a school would
| be a better focus of a million monkeys on MacBooks.
| SanderNL wrote:
| I actually think you have a point. Sure, it's missing nuance
| and there are certainly things 2000s tech will not do properly,
 | but to be honest I'm not quite a hundred percent sure what
 | the past two decades brought other than funky frameworks and
| libraries. Useful, but funky. I know for example CSS animations
| and layout possibilities are off the charts nowadays but they
| do not strike me as fundamental improvements although the QoL
| is certainly appreciated.
|
 | What did happen over the last two decades is browsers getting
 | their act together, though; that made a huge difference.
|
| I'm a webdev. I work with React and Angular. Again, missing
| nuance but I'm not completely sure we are on the right track at
| all. Not sure what we should do.
|
| I made a comparison between crystal meth and wordpress before.
| I'm not sure getting us all on the web (crystal meth) train is
| the way to go. Sure it's standard and lots of people like it.
| It's cheap.. but yeah. The alternative might be painful (aka
| not do it).
| FpUser wrote:
| >"JS is a horrible, bare bones language which is why you can't
| get anything done without 100 packages."
|
 | BS. I use JS for front ends without any frameworks, just some
 | domain-specific libs. Works fine and I "get the shit done".
 | Sure, I prefer compiled languages like C++, but I'll code in
 | anything if it gets the project completed faster given
 | whatever constraints.
| dathinab wrote:
| > Websites do more or less the same thing they did 15 years ago
|
| no not at all
|
 | and browsers internally diverge far, far more than that in
 | their handling of things
| marcosdumay wrote:
| What do you mean? HTML is just a hierarchical document format,
| and CSS is just a logical document styling language. Those two
| are about as simple as they can be.
|
| Accessibility is a very hard and complex thing where half of
| its outspoken "promoters" fight against every gain because they
 | have money invested in the status quo. That's why it doesn't
| improve fast.
|
 | Most developers avoid CSS exactly like most developers avoid
 | SQL: to avoid logical programming. That doesn't mean that
 | logical programming isn't the best known paradigm for data
 | querying and document styling; it just means that most
 | developers are bad and will avoid learning something hard as
 | much as they can. Honestly, I have no idea how to fix this,
 | but conceding our tools to the preferences of the worst of us
 | isn't going to lead to anything good.
|
| JS indeed can be improved. But guess what, many people are
| working hard to replace it.
| kitkat_new wrote:
| > Most developers avoid CSS exactly like most developers
 | avoid SQL, to avoid logical programming.
|
| I avoid CSS and don't avoid SQL nearly as much, because the
| latter is logical, simple and easy to learn.
| LeFantome wrote:
| It has been educational to watch the SerenityOS / Ladybird team
| implement browser features. As they make real websites work,
| you can see how the technologies for even simple brochure sites
| have changed. For example, logos may be SVG instead of simple
| images which makes sense given the range of screen sizes and
| resolutions that need to be supported.
|
| You can also see that CSS continues to grow but in pragmatic
| ways that actually reduce the amount of JavaScript to do the
| kinds of things that users expect these days.
|
 | On the performance and capability front, WASM is a game
 | changer.
|
| Regardless of what JS framework is in vogue, I do not think it
| is fair to say that we are doing the same thing on the web as
| we always have or even to say that the base technologies are
| getting more complex for no reason. At least, it is no more
| fair than any other programming domain. I mean, we can look at
| Excel and VisiCalc and say that we are creating the same apps
| that we always have with more complexity and bloat. I mean,
| people do say that but there is an awful lot left out of that
| analysis. Even more so if I compare VisiCalc and Excel via
| Office365 in a browser.
|
| Regular people are routinely accomplishing far more with their
| computers than they have in the past. More and more of that is
| being done in web browsers.
| smarkov wrote:
| > Regardless of what JS framework is in vogue, I do not think
| it is fair to say that we are doing the same thing on the web
| as we always have or even to say that the base technologies
| are getting more complex for no reason.
|
| Certainly. The things you _can_ do on the web have vastly
| increased, but the things we actually _are_ doing are mostly
 | the same. Read some text, make an account, log in, submit a
 | form, make a payment, upload a file. That's what the vast
| majority of the web has been and still is, yet the path you
| take to get there is many times more complex and the user
| experience hasn't proportionally gotten that much better.
| Arguably, it's gotten worse.
| LeFantome wrote:
| Well, I think you can break it into two parts. I see the
| underlying tech getting better, making it easier, and
 | making it faster. Look at the HTML dialog element as an
 | example.
|
| So, "the path you take to get there" CAN be much improved
| as a developer.
|
| Now, as a user, you may find that sites that could be
| simpler are much heavier and complex than you want them to
| be. My question is why? My argument is that both users and
| producers of these sites WANT them to be more complex.
| Which means blaming the technology is misplaced.
|
| Certainly we CAN still make the sites that ran in Netscape
| 4. Why don't we?
| o1y32 wrote:
 | The second paragraph is simply wrong throughout. You should
 | actually learn the web stack and look at how things work
 | in 2023 instead of forming your opinions from other people's
 | rants and secondhand takes. CSS is not hard. Talk about
 | performance is meaningless without benchmarks. No, you
 | can definitely build a decent small website with a few or
 | even no package dependencies. Maybe do a reality check first.
|
| > It's time to move away from HTML/CSS/JS
|
 | Just meaningless empty talk, over and over, on HN. People who
 | don't understand web stacks ranting about the web without
 | knowing the complexity and nuances. What alternatives do you
 | have? Are they going to provide as many features and as much
 | customization as the status quo? Will this new thing achieve
 | at least 80% of the development speed? Many companies adopted
 | the web stack for their applications because of the
 | flexibility and time to deliver. They make products instead
 | of being purists.
| lvass wrote:
| >What alternatives do you have?
|
| Gemini for pages and if you really need an app, wasm and
| webgl/webgpu. HTML/CSS/JS can definitely be replaced by
| better, already existing things, but browsers will still need
| to understand those for many years going forward, even if
| everyone suddenly stopped developing with them.
| FireInsight wrote:
| WebGPU/GL can't replace HTML on the web, even in webapps,
 | because it has zero accessibility or semantics.
| lvass wrote:
| Webapps don't really have semantics. Accessibility is the
| toolkit's job, leaving that to the web platform
| invariably generates half baked results in an app.
| smarkov wrote:
| > You should actually learn the web stack and look at how
| things work in 2023 instead of reading other people's rants
| or secondhand opinions to form your own opinions.
|
| I didn't base my response on anybody else's words. I've been
 | around the web since the good old days when jQuery was the
| norm and everything had a phpBB forum. It's coming from my
| own experience and observations.
|
| > Will this new thing achieve at least 80% of development
| speed?
|
| You're so concerned with development speed, yet you're
| rewriting the same thing in a new framework every 2 years.
| refulgentis wrote:
 | OP's right, and it's sort of immature to pretend it's all
| broken and say wildly false things like "you're rewriting
| the same thing in a new framework every 2 years." Who is?
| That sounds like a management problem, not a...I don't even
| know, language problem? Package manager problem?
| ori_b wrote:
| Stack maturity problem?
| TheCoreh wrote:
| > You're so concerned with development speed, yet you're
| rewriting the same thing in a new framework every 2 years.
|
| I think that meme gets thrown around a lot, and it used to
| be true, but the framework churn of JS has largely stopped
| at this point. A lot of us (me included) have been just
| using the same stack for several years now.
|
| Sure, React has evolved since 2015/2016 (e.g. the move to
| function components and hooks) but the old code still runs,
| with minimal to no changes. Patterns and best practices
| have largely been established. You can entirely avoid the
| bleeding edge stuff and still be productive.
| nirvdrum wrote:
| Elsewhere in this thread there are comments about needing
| to continue pace to move beyond React. I don't think
| we've seen the end of the churn. I agree that React is a
| pretty stable base these days and you can just work with
| that, but it's gotten long in the tooth in places so new
| frameworks are still sprouting up.
| BSEdlMMldESB wrote:
| > good old days where jQuery ...
|
| do you remember IE6? that's before jQuery, the time of
| ActiveX and java applets
|
| you're right that it got better by the time of jQuery...
|
 | when I started on this (highschool time for me), we were
 | writing HTML by hand. PHP 3 was used in production; PHP 4
| was brand spanking new. CGI was a thing, imagine that! C
| code used to dynamically make HTML stuff, exposed on the
| web! (through C? gateway interface)
| slater wrote:
| *Common gateway interface
| ttfkam wrote:
| I've been around the web since document.write() and <font
| color=red> were cutting edge.
|
| You're wrong.
|
| There is no way, shape, or form where 2008 web technology
| is comparable to today. (IE6!!!) Not in styling. Not in
| consistency. Not in performance. Not in accessibility. Not
| in security. And certainly not in management of complex
| sites.
|
| A cursory look at caniuse.com should disabuse you of any
| notion of stagnation or lack of capability.
|
| Folks rewrite "every two years" because it gets better so
| quickly. Svelte/Solid/Qwik are clearly steps forward from
| React. Were you advocating for sticking with older stuff
| simply because you don't like change? Or did you think C,
| Unix, et al technologies were born fully formed as they
| exist today with no evolutionary steps (and missteps) in
| between?
| nirvdrum wrote:
| I think the charitable interpretation is there are a
| whole class of web sites/applications (news sites,
| e-commerce, etc.) that haven't appreciably changed in the
| last 15 years. Yeah, there's been some incremental
| improvement, but the core experience is the same.
| However, the versions today are generally resource hogs.
| It's not rare for me to leave a browser tab open and find
| it grows to > 2GB RAM. I had one hit 28 GB and it would
| have kept going if I hadn't killed it. One thing I miss
| on the M1 is the fan kicking on so I know when a browser
| tab has run away.
|
| I think the OP has a point. We've been building a
| massively complex ecosystem on a very shaky foundation.
| The web has indeed advanced, but most of the time it
| feels like duct taping something onto this monster of a
| creation. Between backwards compatibility and competing
| browser vendor concerns, it's hard to push meaningful
| change through. So many security (especially supply
| chain) and dependency issues in JS would go away if there
| were a reasonable standard library. But, there isn't, so
 | we're doomed to a large graph of inter-related dependencies
| that break regularly. The constant churn in frameworks is
| mostly related to novel ways to work around limitations
| in the web platform and bridging behavioral differences
| across browsers.
|
| It's more than just churn in frameworks though. It's
| depressing how many person-years have been spent on build
| systems and bundlers or CJS vs ES6. Participating in the
| open source ecosystem means needing to be familiar with
| all of it, so it's not enough to pick one and run with
| it.
|
| Prior to flexbox, people struggled to center content with
| CSS; it was bad for layout. There's a ton of old CSS out
| there, much of it accomplishing the same thing in
| different ways, often with vendor-specific options. You
| still run into this if you have to style HTML email.
| Given the difficulty in mapping CSS back to its usages,
| it's incredibly challenging to refactor. It requires
 | familiarity with the entire product and a lot of guesswork
 | as to intent, since comments don't exist.
|
| Moreover, there have been massive changes in "best
| practices". React and Tailwind violate the previous
 | principles of unobtrusive JS and semantic naming that
| were the prevailing practices not too long ago. My
| cynical take is a lot of that is driven by consultants
| looking to stand out as thought leaders. Regardless, it
| adds to the complexity of what you need to know if you do
| have to work with older (often other people's) code.
|
| I'm fairly confident that if we had today's use cases in
| mind when designing the foundational web technologies
| we'd have a very different platform to work with. It
| almost certainly would be more efficient, less error-
| prone, and likely more secure. It'd be nice if we could
| take a step back and revisit things. A clean break would
| be painful, but could work. Developers are already
| building web apps that only work on Chrome, so much as it
| pains me to say, having to get all the vendors on board
| isn't entirely necessary.
| solarkraft wrote:
| > React and Tailwind violate the previous principles of
 | > unobtrusive JS and semantic naming that were the
| prevailing practices not too long ago. My cynical take is
| a lot of that is driven by consultants looking to stand
| out as thought leaders.
|
| You're missing that people are adopting this on their own
| because they enjoy the benefits during development.
|
| This is also my explanation for why apps with the same
| complexity as 15 years ago are now slower: What Andy
| Giveth, Bill taketh away. Developers (or rather their
| management) are choosing to make "lazy" trade-offs, i.e.
| prioritizing programmer productivity over execution
 | performance. Which makes economic sense.
|
| I wish there was a way to shift the economic incentives
| because that would change things very quickly. Maybe some
| browser performance limits.
| kitkat_new wrote:
| > CSS is not hard.
|
| CSS is more difficult than C++
| Y_Y wrote:
| The tone of this post really rubbed me the wrong way. I know
| nobody reads the guidelines, and that this post violates them
| even more, but I really like the civilised interaction norm
| here and when otherwise earnest and substantive replies come
 | in a form that appears to me combative and unempathetic, then
| it's quite disappointing.
| tjoff wrote:
| Alternatives we have?
|
 | Regular good old html with no, or a _minimal_ amount of,
 | javascript. Absurdly fast compared to today's sites. Many
| orders of magnitude less power.
|
| Faster to develop and better user experience too.
|
| But I also think we should think about alternatives. Things
 | like gemini are interesting but will have a hard time going
 | mainstream. But I do believe we need something completely new,
| because the user hostile nature of the web today is so
| devastatingly toxic that I see no hope for the future.
| LeFantome wrote:
| It depends on what you mean by user hostile.
|
| Gemini is cool but probably just repeating the pattern. If
| companies and users wanted that, HTML use would have gone a
| different way.
|
| The truth is that both consumers and producers of even
| "simple", "static" content want a lot more than these kinds
| of solutions offer. We have web fonts, animated gradients,
| SVG logos, responsive layouts, and the like because people
| want them. Choosing another tech that lacks them will only
| result in them all getting added back again.
| pferde wrote:
| The best thing would be to split web _pages_ and web
| _apps_. Webpages can keep using HTML with some
| minimalistic, gracefully degrading javascript, while
| webapps need something better than hacks upon hacks upon
| hacks upon ... on top of technology that was meant to
| display static interlinked documents.
| elementalest wrote:
 | Isn't that what's happening to a certain extent with wasm?
| solarkraft wrote:
| I don't see how
| solarkraft wrote:
| > I do believe we need something completely new because the
| user hostile nature of the web today is so devastatingly
| toxic that I see no hope for the future.
|
 | Capitalism ate the HTML web and it's naive to expect that
| it wouldn't eat the gemini web given there's a profit to be
| made. The thing that makes an alternative technology non-
| hostile is simply that it's too unpopular to make a profit
| from making it suck.
| ttfkam wrote:
| > Regular good old html without or with minimal amount of
| javascript.
|
| You just described Svelte. All the JS minimalism and speed
| without sacrificing the power.
| emptysongglass wrote:
| (Not a web developer just curious): does Astro also offer
| this?
| cdogl wrote:
| > Websites do more or less the same thing they did 15 years ago
|
| As a tech enthusiast, I've paid close attention to what tasks I
| can achieve on my PC (and eventually smartphone) over the past
| 15 years. In my experience, this is simply not true.
| ThunderSizzle wrote:
| What has changed significantly on the website side?
|
| My bank website did a full re-design to be modern looking,
| but it's slower, no longer supports multiple tabs, and links
| don't work well, especially if you go through the sign in
| flow. The ones that didn't redesign into a JS heavy spa are
| still faster and links tend to be more reliable for e.g.
| bookmarking. They don't look great, but they tend to be more
| functional.
| jeroenhd wrote:
| CSS has become much easier. CSS grids and flexbox have removed
| all the stupid layout hacks from long ago. No more need for
| HTML tables to get two elements to align. The CSS spec has also
| improved a lot.
|
| JS has also improved massively; though it can't rid itself of
| its original flaws, it has solved a lot of problems in fifteen
| years.
|
| Accessibility has become trivial. You really don't need all
| that much knowledge about screen readers anymore. If you follow
| the standard even just a little, screen readers will already
| Just Work.
|
| HTTP/3, WebP, Brotli, and a whole range of technologies have
| made some of the worst pain points of web development go away
| for free. No more messing around with connection pipelining,
| ordering resources to work around the browser load order,
| minifying text resources manually, you just let the tech stack
| do its thing and it'll work fine.
|
| The problem isn't the web stack, it's the framework of the week
| throwing out the last five years of development as "bloat",
| redefining how things Should Be, and then growing to a form
| where it's considered "bloat" and replaced again.
|
| I started my web dev career with PHP on the server. It's
| really not that bad. The web works fine.
|
| Things that work fine are boring, though. Writing HTML and
| wiring basic actions to it is tedious work when there are all
| of these interesting frameworks and methodologies to try out.
|
| Or, in some cases, people skip the basics and start with
| learning React or another heavy Javascript framework. Who needs
| to know the difference between a <nav> and a <div class="nav">
| when you only have two weeks to get through the bootcamp?
|
| It's really not all that difficult to make websites. Web apps
| are even easier because you can demand Javascript and all of
| its tooling to be present.
|
| We've tried replacing the web with apps on phones. As it turns
| out, that's just as hard, often even harder. The actual
| problems that make web development hard just aren't easy
| problems to solve. Throwing complex layers of other people's
| code over them sometimes helps, but in most cases anything that
| promises to make web development easy just moves the complexity
| some place else.
| tgtweak wrote:
| A quick test-run on the pages linked from the hackernews front
| page:
|
| hackernews svg logo doesn't render
|
| youtube embed on the servo homepage doesn't load
|
| https://github.com/tpope/timl -> reloads/rerenders infinitely,
| uses 100% of 1 core
|
| https://akrzemi1.wordpress.com/2023/04/23/the-obvious-final-...
| -> hangs while loading the page, uses 100% of 1 core
|
| https://equalitytime.github.io/FlowersForTuring/ -> loads/renders
| fine
|
| https://github.com/reactos/reactos -> reloads/rerenders
| infinitely, uses 100% of 1 core
|
| https://dolphin-emu.org/blog/2023/05/27/dolphin-steam-indefi...
| -> loads fine but css rendering/placement is off on breadcrumbs
| and article
|
| https://xcp-ng.org/blog/2023/05/23/new-xen-updates-on-risc-v...
| -> very long load then hangs while loading javascript/content.
|
| Not a great batting average - seems like sites need to be
| purpose-built for this or use few or no modern JavaScript
| libraries for them to render accurately. Servo has been in this
| state for the
| last 5 or 6 years since I first discovered it.
| brundolf wrote:
| It isn't production-ready and I don't think it claims to be
| pohl wrote:
| Over how many of those years was there any active development,
| though? It just re-started recently.
| dathinab wrote:
| and before it was restarted it wasn't even meant to ever
| become a full fledged stand alone browser at all...
| dathinab wrote:
| it's not a full-fledged browser for now, was never intended to
| become one, maybe now will
|
| it doesn't even call itself a browser but a "web rendering
| engine"
|
| a lot of sites today depend on not-quite-standard, messy, or
| very complicated standardized behavior just to render
| correctly
|
| it wasn't really maintained for a long while until very
| recently (because it originally was never meant to become a
| full browser)
|
| so I'm not sure what you are expecting
| kramerger wrote:
| How does that compare to ladybird?
|
| https://awesomekling.github.io/Ladybird-a-new-cross-platform...
| insanitybit wrote:
| Servo will forever be a testament to Mozilla's absolute failure
| as an organization. Finally there was a renewed interest in
| Firefox _as a technology_ and their dipshit CEO fired them all
| "because covid" while taking their largest ever bonus.
|
| I hope that it can become something more than that, but I'll
| always remember it.
| surajrmal wrote:
| Technology doesn't matter if it doesn't convert users. Perhaps
| the improvements were too small given the level of investment
| necessary and given the bleak outlook, they decided they
| couldn't keep it up. I am happier knowing Mozilla will live to
| see another day than see it die off.
|
| As far as pay goes, the Mozilla board ultimately is in charge
| of setting that. I'm not sure it makes sense to blame the CEO
| for taking the compensation they were promised. The fact it is
| correlated with layoffs just speaks to the large incentives
| those in charge throw at the CEO to perform layoffs when
| necessary.
| ohgodplsno wrote:
| the Mozilla Board is a living proof that the following
| keywords in absolutely any leadership position are a death
| sentence: Stanford, Harvard, McKinsey. The Mozilla board are
| sycophants that prop each other up in the various boards they
| are each a part of to leech off as much as they can before
| jumping ship to another.
|
| https://www.mozilla.org/en-US/about/leadership/boards-of-
| dir...
|
| None of these people have demonstrated any leadership
| ability, nor justified a single dollar of their pay.
| ptx wrote:
| The technology is what matters to users. Mozilla only matters
| in that they develop and publish the software that users care
| about.
|
| If the software is forked and development is continued as an
| open source project under the auspices of another
| organization, that works just as well for users. Conversely,
| if the software dies but the Mozilla Corporation survives as
| the world's leading manufacturer of paperclips, that's of no
| value to the software's users.
| Onavo wrote:
| Who cares if their CEO did not invent JavaScript, she donated
| to politically correct causes, shook the right hands, and
| virtue signaled at the right time. As far as Mozilla is
| concerned, they got the leader they deserved.
| refulgentis wrote:
| It's not so much this Woke Right view that everyone deserves
| 0 consequences in every environment, and lefties have 0
| consequences, but the inverse.
|
| Bad idea to fight gay marriage when you run a company, not
| very respectful or inclusive and causes a lot of headaches
| for board, HR, and legal.
|
| Nobody really cares about the new CEO as long as they don't
| do that. The CEO that replaced Brendan wasn't a she, you're
| trying to connect events 6 years after Brendan to Brendan.
| erk__ wrote:
| It should be noted that development has been getting back into
| gear again, driven mostly by Igalia after some external funding.
| fmiras wrote:
| This is awesome to hear, thanks Igalia
| pabs3 wrote:
| Who are the new funders for Igalia's work?
| capableweb wrote:
| > Igalia is a private, worker-owned, employee-run cooperative
| model consultancy focused on open source software. Based in A
| Coruna, Galicia (Spain), Igalia is known for its
| contributions and commitments to both open-source and open
| standards. Igalia's primary focus is on open source solutions
| for a large set of hardware and software platforms centering
| on browsers, graphics, multimedia, compilers, device drivers,
| virtualization, and embedded Linux.
|
| https://en.wikipedia.org/wiki/Igalia
|
| https://people.igalia.com/mrego/servo/igalia-servo-tsc-2022/
|
| Seemingly, they are funding this work themselves, probably
| from income from the consultancy.
|
| Edit: Also, seems Igalia received public funding, the latest
| one in 2022. In Spanish: https://www.igalia.com/public-
| funding-projects/public-fundin...
| afiori wrote:
| From what I gathered from a few podcasts it is mostly
| employee-driven championing of priorities and crowdfunding
| HyperSane wrote:
| I wish most companies were run this way.
| password4321 wrote:
| Maybe "this is the way" to fund an alternative browser!
| ktosobcy wrote:
| One wonders if there would be more user donations to
| Firefox if those were directly going to browser
| development and not to mozilla...
| orra wrote:
| I don't see how this Mozilla hate is relevant. Igalia
| aren't ordinary users: they're developers.
| samsquire wrote:
| Is compositing different bitmaps an embarrassingly parallel
| problem?
|
| I started writing a simple layout engine based on the ORCSolver
| white paper but I wonder how layout can be parallelized.
|
| https://GitHub.com/samsquire/browser
|
| I always thought servers could prelayout the HTML to determine
| Xs, Ys, and widths for different browser viewport sizes and this
| metadata would speed up layout because you have a good candidate
| for layout immediately.
|
| When it comes to changing the DOM and reflow, I wonder if you can
| reflow against the row only into chunks of the screen and
| recomposite bitmaps to move elements down the screen.
|
| Some web pages are really pathological with relayout costs
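For the simple case, yes: per-pixel compositing is embarrassingly parallel, because each output pixel depends only on the input pixels at the same position, so the buffers can be split across threads with no synchronization. A minimal illustrative sketch in Rust - not Servo's actual compositor; `composite_parallel` and the 8-bit RGBA "source over" blend are assumptions for illustration:

```rust
use std::thread;

// Alpha-blend `src` over `dst` in place. Pixels are [r, g, b, a] with u8
// channels. Every output pixel depends only on the pixel at the same index,
// so the buffers can be split into chunks and blended on separate threads.
fn composite_parallel(dst: &mut [[u8; 4]], src: &[[u8; 4]], threads: usize) {
    if dst.is_empty() {
        return;
    }
    let threads = threads.max(1);
    let chunk = (dst.len() + threads - 1) / threads; // ceil division
    thread::scope(|s| {
        for (d, sr) in dst.chunks_mut(chunk).zip(src.chunks(chunk)) {
            s.spawn(move || {
                for (dp, sp) in d.iter_mut().zip(sr) {
                    let a = sp[3] as u32; // source alpha, 0..=255
                    for c in 0..3 {
                        dp[c] = ((sp[c] as u32 * a
                            + dp[c] as u32 * (255 - a))
                            / 255) as u8;
                    }
                    dp[3] = 255; // output stays opaque
                }
            });
        }
    });
}

fn main() {
    let mut dst = vec![[0u8, 0, 0, 255]; 1024]; // opaque black background
    let src = vec![[255u8, 255, 255, 255]; 1024]; // opaque white layer
    composite_parallel(&mut dst, &src, 4);
    assert_eq!(dst[0], [255, 255, 255, 255]); // src fully replaces dst
    println!("composited {} pixels", dst.len());
}
```

The hard part in a real engine isn't this blend; it's deciding which layers exist and when they're invalidated, which is where compositing stops being embarrassingly parallel.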
| nextaccountic wrote:
| > Is compositing different bitmaps an embarrassingly parallel
| problem?
|
| > I started writing a simple layout engine based on the
| ORCSolver white paper but I wonder how layout can be
| parallelized.
|
| Well, Servo has webrender and stylo (now adopted in Firefox),
| which are aimed at harnessing the available parallelism to do
| those tasks
| chrismorgan wrote:
| CSS Containment <https://developer.mozilla.org/en-
| US/docs/Web/CSS/CSS_Contain...> defines ways of optimising a
| lot of calculation and recalculation, roughly by forbidding
| various of the harder or slower cases. (It's approximately just
| an optimisation hint, though it does change behaviour if you
| try to break out of the containment you declare. It's
| conceivable that in the future browsers could determine whether
| these optimisations can safely be done without the need for the
| hint, though that could only help _relayout_ , never initial
| layout.) Some forms of containment that you can declare
| definitely allow for parallel layout to work without it being
| speculative.
|
| > _I always thought servers could prelayout the HTML to
| determine X and Ys and widths for different browser viewport
| sizes and this metadata would speed up layout because you have
| a good candidate for layout immediately._
|
| They tried this with <img srcset>, where it can have a lot more
| value since you can fetch a lower-quality version of the image
| if you don't need more pixels. In practice, effective use is
| somewhere between uncommon and extremely rare, depending on how
| you want to classify things. Some reconceptualisation of that
| that integrated with container queries
| <https://developer.mozilla.org/en-
| US/docs/Web/CSS/CSS_contain...> might have more potential. But
| all up, things like media queries, viewport units and calc()
| make any kind of "serialise the layout" concept extraordinarily
| difficult to do to any useful degree, and for almost no
| benefit.
| TuringTest wrote:
| > Is compositing different bitmaps an embarrassingly parallel
| problem?
|
| If you can make an initial approximate guess of the area for
| each section, you can split regions with lower and upper size
| constraints that can be laid out in parallel. A global branch
| and bound algorithm can coordinate those processes, tightening
| the upper constraints of one region when another connected
| region exceeds its minimum bounds.
|
| Indeed there still may be degenerate cases where multiple
| regions need to be recalculated several times, but overall the
| default case should be able to finish without much conflict.
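The measure-independently-then-reconcile idea above can be sketched in miniature - a toy model, not Servo's layout code; `Block`, `measure`, and the words-per-line heuristic are invented purely for illustration:

```rust
use std::thread;

// A toy block of text: the only layout input is its word count.
struct Block {
    word_count: usize,
}

// Hypothetical measure step: given the viewport width, a block's height (in
// lines) depends only on its own content, so this pass can run in parallel.
fn measure(block: &Block, width_px: usize) -> usize {
    let words_per_line = (width_px / 8).max(1); // pretend a word is ~8px wide
    (block.word_count + words_per_line - 1) / words_per_line
}

// Returns (y_offset, height) for each block.
fn layout(blocks: &[Block], width_px: usize) -> Vec<(usize, usize)> {
    // Pass 1 (parallel): measure every block independently.
    let mut heights = vec![0usize; blocks.len()];
    thread::scope(|s| {
        for (h, b) in heights.iter_mut().zip(blocks) {
            s.spawn(move || *h = measure(b, width_px));
        }
    });
    // Pass 2 (sequential, cheap): stack the measured blocks vertically.
    let mut y = 0;
    heights
        .iter()
        .map(|&h| {
            let pos = (y, h);
            y += h;
            pos
        })
        .collect()
}

fn main() {
    let blocks = vec![Block { word_count: 25 }, Block { word_count: 5 }];
    // 80px viewport -> 10 words per line -> heights of 3 and 1 lines.
    let positions = layout(&blocks, 80);
    assert_eq!(positions, vec![(0, 3), (3, 1)]);
    println!("{:?}", positions);
}
```

A real engine would use a work-stealing thread pool rather than one thread per block, and the degenerate cases mentioned above show up when a block's measured size feeds back into its siblings' constraints, forcing re-measurement.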
| benreesman wrote:
| I've been out of the browser rendering engine game for a long
| time.
|
| But even back in the day, the various ways that a DOM element's
| physical layout could escape its parent's bounding box, the
| multiple stages of coordinate evaluation, the zillion edge
| cases: it's hard.
|
| And that's assuming that all inputs are perfectly formed. By
| the time you've cranked in enough Poe's Law to be more than a
| tech demo, and enough performance to interest anyone?
|
| It's one of the nastiest practical problems you'll encounter in
| a long career and the cross product of the kooky standard and
| the bugs you have to honor on the real web (a back compat story
| worse than Win32) means that there's no One Elegant Answer.
|
| It's a fucking mess and my hat is tipped to the hard-ass
| veterans who can even make a credible attempt.
|
| I'd rather be on the hook for a modern RDBMS than a modern
| browser.
| Sesse__ wrote:
| > I'd rather be on the hook for a modern RDBMS than a modern
| browser.
|
| I've worked on both a RDBMS and a browser, and yes, I'm
| inclined to agree. In particular, it is largely acceptable
| for an RDBMS to pick a niche where you perform well, and do
| worse in others--and you're (silently) allowed to support
| only a given subset and have your own incompatibilities.
| Nobody would ever use a browser that performs well on NYT but
| could hardly run YouTube (or the other way around).
|
| Interestingly, there are very small teams who have written
| usable instances of both.
| benreesman wrote:
| It's quite a career that includes both! Congratulations on
| an interesting professional life!
|
| Are you at liberty to share any inside baseball about the
| small teams you mentioned?
| Sesse__ wrote:
| I haven't been part of a small team building either, but
| the obvious small team doing RDBMSes is SQLite. Three
| people building the world's most popular software, and
| possibly the world's most robust database. They found a
| niche and implemented it perfectly.
|
| For browsers, the most prominent example is probably
| LibWeb/Ladybird, part of SerenityOS (but with a Qt port,
| allowing it to run on other operating systems). I've
| never tried it myself, but supposedly it's complete
| enough to run large web applications fairly acceptably,
| which is amazing for something developed largely by
| basically two people AIUI.
| knome wrote:
| >By the time you've cranked in enough Poe's Law to be more
| than a tech demo
|
| >Poe's Law
|
| What? What is this trying to say here? Is it a typo?
| ptx wrote:
| Perhaps referring to supporting web pages that are so
| poorly implemented that you suspect the developer did it as
| a joke? :)
| TUSF wrote:
| See the previous paragraph, describing things that "could
| go wrong."
| jeroenhd wrote:
| I've been watching the SerenityOS videos and especially the
| Ladybird videos show a decent chunk of spec. One man (okay,
| one man with a history in WebKit development) and a bunch of
| volunteers have built up a browser that could almost be
| competitive if it existed back in the day. Of course the
| people working on the project are incredibly talented, but
| I'd argue that if one full time employee developing both an
| operating system and a browser can get this much done with a
| community, the problem isn't as hard as many people deem it
| to be.
|
| There's a lot that can go wrong, but the spec describes much
| more than I remember reading back in the day. Layout
| algorithms and such are all spelled out. There's no arcane
| knowledge in finding out what the right size of an <img> is,
| there are rules that will tell you exactly what you want to
| know if you follow the spec.
|
| Not everything has been documented as well as it should be,
| but the vagueness and complexity of the web stack devs often
| like to lament about really isn't that difficult. It's a lot
| of reading, and boring reading at that, but the problems all
| seem relatively straightforward to me.
|
| Even quirks mode has been largely documented. A spec for
| things that don't follow the spec!
|
| I know things were very hard back when Internet Explorer was
| still a major browser because there were no rules. These
| days, you can either find your answer in the spec, or the
| publicly available source code for your competition.
|
| With the amount of features being crammed into database
| systems these days, I'm not sure what I would prefer to
| maintain. A modern RDBMS sure seems less complex than a
| browser, particularly because there's no user facing UI, but
| the problems with an RDBMS are a lot more about (inventing)
| complex algorithms than many of the modern browser
| challenges.
| hexo wrote:
| > Some web pages are really pathological with relayout costs
|
| I'm always wondering how much do these awesome pathological
| sites contribute to CO2 emissions. Along with ads, autoplay
| videos, huge and slow JS libraries, _animations_ , and other
| needless and outright unwanted content. Even more on mobile.
| I'd say it's a lot and could be cut off immediately with very
| positive impact and no real loss.
| coldtea wrote:
| [flagged]
| GeertJohan wrote:
| Not sure if sarcasm or..? Can you elaborate?
| botanical wrote:
| How come since Mozilla let this go, it gets so much engagement
| through opening of issues? Which types of people are using this?
| ktosobcy wrote:
| well... look at thunderbird - mozilla basically let it go as
| well and it made a spectacular comeback with huge funding from
| users / community... as if users do like to contribute directly
| to the software development and not to the mozilla org with its
| weird goals... food for thought...
| capableweb wrote:
| Mozilla laid off ~25% of their workforce, the Servo team was
| mostly (maybe even all?) fired in that layoff.
|
| https://www.cnet.com/tech/computing/mozilla-cutting-250-jobs...
|
| Comments at the time:
|
| https://news.ycombinator.com/item?id=24128865 (Mozilla Fires
| Servo Devs and Security Response Teams)
|
| https://news.ycombinator.com/item?id=24120336 (Mozilla lays off
| 250 employees while it refocuses on commercial products)
|
| In the end, I think Servo is better served by being in the
| Linux Foundation rather than under Mozilla, as Mozilla seems to
| stray further and further from the path they were walking a
| decade ago, sadly.
| wongarsu wrote:
| Servo seemed like a great project for Firefox. Both in the
| near term, by merging parts of servo like the CSS engine, and
| in the long term by promising a much better rendering engine
| than anything else on the market.
|
| But sadly good for Firefox doesn't mean good for Mozilla.
| Making a great browser isn't necessarily their highest
| priority.
| kvark wrote:
| The CSS engine as well as the new renderer from Servo have
| made it into Firefox successfully. Servo has served well.
| the_third_wave wrote:
| [dead]
| doublerabbit wrote:
| Just to think, if every developer got together and contributed a
| single line of code to a new internet stack/browser/protocol
| project we could overtake the corporate poison that's currently
| spreading within the web-space.
| ReactiveJelly wrote:
| But, and I mean this un-ironically, what's in it for me?
| o1y32 wrote:
| If there is such a project where every developer can contribute
| one single line, the result is inevitably that the codebase is
| a horrible unmaintainable mess. Sorry this is the stupidest
| thing I have seen this morning.
| doublerabbit wrote:
| Lack of coffee. Sounded better in my head.
|
| The dream that everyone could get together and make an
| internet utopia without corporate input would be a pleasing
| moment. That was the illusion I was trying to make.
|
| The internet is so bogged down by corporate walled gardens
| that nothing is individual anymore. Not to say they don't
| exist, but they have a very minute presence.
| meindnoch wrote:
| Ok, here's my contribution: exit(1)
| fooker wrote:
| Cautionary tale for the 'rewrite it in Rust' camp
| sebzim4500 wrote:
| I think it's mainly a cautionary tale for the "let's write a
| browser engine" camp.
| pavlov wrote:
| If anything, it's an encouraging tale for the "let's invent a
| new language for our rewrite of a complex application" camp.
|
| The rewrite failed but the language lives on because the
| problem was general enough.
| fooker wrote:
| Most programming languages were born out of such a need,
| not necessarily just the successful ones.
| cookieperson wrote:
| Rust is doing just fine. And your original comment is
| specious at best.
| GeekyBear wrote:
| > The rewrite failed
|
| Servo code, written in Rust, to enable the use of multiple
| CPU cores to speed up rendering was merged into Firefox
| quite a few years ago.
|
| https://hacks.mozilla.org/2017/08/inside-a-super-fast-css-
| en...
|
| Also, shout out to Lin Clark, whose blog posts for Mozilla
| back in the day set a high bar.
| pavlov wrote:
| Fair enough. My impression was that Servo didn't meet its
| original goals. Whether that counts as failure is a
| mindset question, I suppose.
| kzrdude wrote:
| Some of the best things in Servo were taken over by
| firefox, weren't they?
| cookieperson wrote:
| Yes.
| capableweb wrote:
| > The rewrite failed
|
| What rewrite are you talking about?
|
| Servo was never intended to replace Gecko, so it can't be
| that.
|
| Servo was always in Rust, so you're not talking about a
| rewrite there.
|
| Servo delivered on its promise to be a sandbox for
| experiments that might end up in Firefox, so surely you're
| not talking about that either.
| tannhaeuser wrote:
| Was about to write it's a cautionary tale for "let's
| procrastinate to write x by going meta and bikeshedding on
| tools and languages" camp but pavlov beat me to it,
| interpreting what he wrote as sarcasm.
| jeltz wrote:
| Depends. It made Firefox lose market share but it also gave
| us Rust.
| jeltz wrote:
| Not really. The Servo project delivered some amazing results
| which made Firefox's rendering much faster.
| fooker wrote:
| Firefox has been consistently slower than most other browsers
| for the last fifteen years.
| earthling8118 wrote:
| That's just not true. Over 10 years ago I switched to
| Chrome because it was much faster but it's been nearly 5
| years since I switched back because Firefox was blowing it
| away
| jeltz wrote:
| Nope, that has always been highly workload dependent.
| GeekyBear wrote:
| > Cautionary tale for the 'rewrite it in Rust' camp
|
| Mozilla, famously, made multiple attempts to update Firefox's
| rendering engine to take advantage of multiple CPU cores that
| had to be abandoned before they switched over to Rust and
| started to see some success.
|
| >Parallelism is a known hard problem, and the CSS engine is
| very complex. It's also sitting between the two other most
| complex parts of the rendering engine -- the DOM and layout. So
| it would be easy to introduce a bug, and parallelism can result
| in bugs that are very hard to track down, called data races.
|
| https://hacks.mozilla.org/2017/08/inside-a-super-fast-css-en...
| kvark wrote:
| Reminds me of the famous "you have to be this tall to write
| multithreaded code" poster :)
| o1y32 wrote:
| Rust may have helped but I doubt that's the deciding factor.
| This needs more context
| GeekyBear wrote:
| There is already a link to an interview with Josh Matthews,
| who led Servo development, where he makes the case that
| moving to Rust from C is the factor that finally allowed
| the effort to succeed after three previous failed attempts.
|
| https://news.ycombinator.com/item?id=36093636
| Terretta wrote:
| The recent overall assessment work is fantastic. Every project
| should do this from time to time, as a heads up reset and
| refocus.
|
| I'll be using the report comparing layout engines as an exemplar
| for teams thinking about undertaking or already working on
| migrations.
|
| Overall report:
|
| https://github.com/servo/servo/wiki/Servo-Layout-Engines-Rep...
|
| Feature by feature details:
|
| https://github.com/servo/servo/wiki/Layout-2020-and-2013-par...
|
| TL;DR on the report:
|
| Neither Layout 2013 nor Layout 2020 fully support widely used
| features like flexbox, grids, or a reliable float implementation.
| This report proposes focusing on improving Layout 2020, but until
| the project benefits from the renewed energy and focus, it's not
| ready for projects needing these.
| nologic01 wrote:
| Could Servo be used to build desktop webview type apps that can
| leverage all the html/js/css libraries that are available?
|
| The main complaint people seem to have about this approach is the
| large size of the executable, I don't know if Servo can make a
| difference in this respect.
| iknowstuff wrote:
| https://tauri.app/
| Al0neStar wrote:
| I'm not sure if it can support all the libraries but yes it can
| be used to make desktop apps. There's also Sciter.
|
| https://sciter.com/
| wongarsu wrote:
| I really like sciter for webview-type apps. It's really
| lightweight, integrates well with native code, and brings
| some native-app concepts to JS (like the communication
| between multiple app windows). Sure, it's not open source,
| but you can have the source code if you pay for it, I can
| respect that.
|
| My main issue is that it's lacking some of the DOM API. It's
| complete enough that you wouldn't really notice when writing
| code yourself, but good luck finding a charting library that
| runs without modifications.
| throwaway290 wrote:
| It's not open source and there is nothing about Servo from
| what I could find
| Al0neStar wrote:
| Seems like they have plans for Q4 2023
|
| https://servo.org/about/
| conradludgate wrote:
| I was thinking about building a desktop app framework using
| servo, but skipping any html/css parsers and without any JS
| engine. It would use rust and some dsl/macros that compile to
| servo function calls that build the UI. I'm not sure how much
| smaller it would be, but I imagine not having node would be a
| good start
| TUSF wrote:
| I'd been wanting to see this, preferably with JS being
| optional, and just allowing direct DOM access.
|
| I initially thought this was what Azul was, but it's only just
| using Servo's WebRender compositor, and rolls its own CSS
| parser, DOM, and layout engine, so it doesn't benefit from most
| of the work done on Servo, and supports less CSS features.
|
| https://github.com/fschutt/azul
| dan-robertson wrote:
| I think it used to be a goal to be embeddable in the way you
| describe. Or at least better than gecko and at least as good as
| chromium of the time. I'm not sure if that goal changed. I
| think it probably doesn't yet support enough features to meet
| your requirements.
| umanwizard wrote:
| How would that differ from what you can already do with
| Electron?
| Al0neStar wrote:
| If you use Electron your app is bundled with Webkit + NodeJS
| and runs two different processes. Servo is like a
| minimal/limited Webkit and you dont need to write your app
| logic in NodeJS.
|
| A great example is Sciter and Sciter's Notes app
|
| https://notes.sciter.com/
|
| The x64 binary is 5 MB, impressive compared to an Electron
| application.
| jmyeet wrote:
| Here is a good article on how important Servo was to the early
| development of Rust and the feedback loop between the two [1].
|
| I find it interesting to see how Rust chose run-time over
| compile-time. Some of this is completely understandable (eg the
| whole borrow-checking mechanism that is so valuable). Generally
| though, I've found you want to favor compile-time, particularly
| incremental compile-time. It's better to get something to work
| first and then optimize performance later than the other way
| around.
|
| [1]: https://pingcap.medium.com/the-rust-compilation-model-
| calami...
| avgcorrection wrote:
| > It's better to get something to work first and then optimize
| performance later than the other way around.
|
| That's what they did by using LLVM instead of making their own
| compiler backend. The compiler could have apparently (according
| to what has been said) been faster, earlier, if they didn't
| have to go through LLVM.
|
| But yeah, you're right about their priorities: minimizing
| runtime has always been a high priority while designing the
| language, while designing for fast compiles has not. At least
| according to my view from the peanut gallery
|
| I don't know anything about compilers, but it would have been
| interesting to see if a language like Rust would look different
| today in this regard if they focused on incremental compilation
| (instead of batch) from day one.
| ReactiveJelly wrote:
| OTOH, dev machines are usually beefier and fewer in number than
| prod machines.
|
| So if you can give yourself head room to make prod _really_
| fast, in a few years the dev systems may double in power
| anyway.
|
| Incremental compiles are a common source of bugs, right? I find
| it's easy to do something from scratch (like imgui painting the
| whole window every frame) and hard to optimize it not to (like
| damage in GUIs)
| cookieperson wrote:
| Thread ripper with nvme drives for rust development make
| compile times a giggle.
|
| Not sure what you mean by bugs due to incremental
| compilation? I've never seen it happen anyways.
| kbrosnan wrote:
| Happens with some frequency with building Gecko. You can
| see documented instances of it for the CLOBBER file. It
| allows the person building to optionally have the build
| system force a clean rebuild.
|
| https://hg.mozilla.org/mozilla-central/log/tip/CLOBBER
| https://firefox-source-
| docs.mozilla.org/build/buildsystem/mo...
| jeltz wrote:
| > It's better to get something to work first and then optimize
| performance later than the other way around.
|
| As far as I gather that is one of the issues why Rust is slow
| to compile. Fast compilers were usually built with performance
| in mind from the very start while rustc was first made work and
| then optimized.
| phkahler wrote:
| Having the ability to turn off expensive compiler options is
| nice. For Solvespace (C++), turning on LTO makes compilation
| about 6x longer for maybe 15 percent runtime performance. I do
| development without LTO and let releases get LTO. The borrow
| checker in Rust might be too important to turn off, even
| sometimes.
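For Rust projects, Cargo expresses this same split per build profile. A sketch of common settings (not Solvespace's actual configuration, which is C++):

```toml
# Cargo.toml - keep dev builds fast, pay for LTO only on release.
[profile.dev]
lto = false          # the default: quick incremental rebuilds

[profile.release]
lto = "fat"          # whole-program LTO: much slower builds, faster binary
codegen-units = 1    # trades more compile time for runtime performance
```

And as the reply below notes, the borrow checker is not an optional optimization pass the way LTO is; it's part of what makes the program well-formed.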
| xavxav wrote:
| > The borrow checker in Rust might be to important to turn
| off even sometimes.
|
| Variations of this point come up often - that the main reason
| rust compile time is slow is typechecking or borrowchecking -
| but that's not true: while those operations are expensive,
| it's monomorphization (and llvm) that are the main drivers of
| compile time currently. Building a compiler on top of llvm is
| unfortunately going to be sluggish for feedback loops,
| regardless of your type system.
|
| Also, unlike LTO, borrow checking is not just an optimization;
| turning it off would be like allowing you to add arrays and
| booleans together: meaningless and wrong. Anyway, you wouldn't
| save more than ~10% of compile time.
|
| Apart from the rant above, I do agree that having the ability
| to tune optimization levels for development vs prod is good,
| I would like it if it were easier to do stuff like PGO or
| other stupidly expensive optimizations for releases.
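| To illustrate the monomorphization point above: every concrete
| type a generic function is used with gets its own compiled
| copy, so generic-heavy code multiplies the work handed to
| LLVM. A small sketch (hypothetical example, not Servo code):

```rust
// A generic function: the compiler emits a separate machine-code
// copy for every concrete type it is instantiated with, and LLVM
// optimizes each copy independently.
fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
    let mut max = items[0];
    for &x in &items[1..] {
        if x > max {
            max = x;
        }
    }
    max
}

fn main() {
    // Two instantiations here: largest::<i32> and largest::<f64>.
    // Each one is monomorphized and compiled separately.
    println!("{}", largest(&[1, 5, 3]));
    println!("{}", largest(&[1.5, 0.2]));
}
```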
| est31 wrote:
| The main competitor of Rust for writing browsers is C++, whose
| compilers don't have in-unit incremental compilation. That's
| why Rust didn't start out with incremental compilation; it was
| added later on, so now you do have it (although it could be
| improved in many ways).
| cookieperson wrote:
| There's a balance between the two, and that balance shifts
| based on product/project requirements. I can't imagine anyone
| writing a slow browser engine and being like "yeah, but we
| will make it fast later!" Screen tearing, 5-second plain-text
| page loads, etc. don't make for compelling demos. Similarly,
| an OS or a database can't fall below some reasonable
| performance threshold even in prototyping. Otherwise you
| aren't even solving the problems; the apparent remaining 10%
| of the work basically demands a complete rewrite plus many new
| feature additions - i.e. you've done nothing.
| easton wrote:
| > I can't imagine anyone writing a slow browser engine and
| being like "yea but we will make it fast later!"
|
| Funny, that's the exact opinion of the people building
| Ladybird for SerenityOS. They are of course doing it for fun,
| and it doesn't take five seconds to show text.
| cookieperson wrote:
| I wish them every bit of success but I'd worry about their
| OS if it was built under the same principles.
| eis wrote:
| I think it would be great if the homepage had a prominent link
| to the documentation, as it wasn't obvious to me where to find
| it. I followed the link to GitHub, whose readme has a howto
| for building the project, and there is a docs folder with a
| few documents, mostly dealing with how to get started
| contributing.
|
| In the end I think a good entry point is the wiki on GitHub:
| https://github.com/servo/servo/wiki
|
| I would also love to have a clear overview page that shows a high
| level view of the state of the engine in terms of feature
| support/completeness.
|
| I'd love to see the project get revitalized and gain traction
| now that it is outside Mozilla, and maybe eventually a new
| phoenix might rise.
| BSEdlMMldESB wrote:
| [flagged]
| paulrouget wrote:
| Rust helped Servo. But Servo also helped Rust:
|
| I encourage people to listen to Josh Matthews interview [0]. He
| talks about how, early on, Servo was a "guiding light for Rust".
|
| 0:
| https://podcasts.google.com/feed/aHR0cHM6Ly9ydXN0YWNlYW4tc3R...
| GuB-42 wrote:
| Rust is to Servo what C is to UNIX. Even though they live their
| own lives now, the ties are strong.
|
| I think programming languages must be developed alongside
| their application. For Rust it is Servo, Go is for Google, Lua
| is for scripting video games, Swift is for making iOS apps,
| and of course C is for UNIX.
| SanderSantema wrote:
| Lua wasn't made for scripting video games but as a scripting
| language for a mining company, if I remember correctly from
| this excellent paper [1]. I can highly recommend all the other
| ACM HOPL papers to those interested in the history of
| programming languages. The papers don't seem to be open
| access, but sci-hub might help with that if you're so
| inclined.
|
| [1] https://dl.acm.org/doi/10.1145/3186277
| carterschonwald wrote:
| The hopl paper on lua seems to indicate it came out of a
| Brazil university. https://www.lua.org/doc/hopl.pdf
| airstrike wrote:
| Indeed. It was created at PUC-Rio, my alma mater
| tomjakubowski wrote:
| Lua means moon in Portuguese! (cognate of Luna)
| zendist wrote:
| This one? https://www.lua.org/doc/cacm2018.pdf
| osigurdson wrote:
| I once met Waldemar Celes, one of the co-creators of Lua. It
| was meant for use within a tool called Geresim for
| visualizing large 3D models of oil/gas reservoirs - primarily
| for Petrobras, the Brazilian national oil company.
| jf wrote:
| I'd love to see your other suggestions!
| tomphoolery wrote:
| Go is for DevOps, full-stop. :D
| mch82 wrote:
| Products too! Technology projects without an end user often
| become vaporware, because it's impossible to make design
| decisions without constraints. A real end user adds
| constraints. Constraints enable good decision-making. It's
| much easier to answer a specific question ("is this a good
| decision for a browser rendering engine?") than an abstract
| one ("is this a good decision for software?").
|
| I don't use Rust, but I appreciate how it solves clear
| problems. I'm really impressed by the way key issues like
| packaging & distribution, documentation, and coding
| style/safety are addressed at the language level. I bet those
| features emerged from the need to address issues with the
| open-source development model. Without Servo, maybe those
| features don't make it.
| underdeserver wrote:
| Funny. During my time at Google (7 years, left earlier this
| year), I can count on one hand the number of people who used
| Go.
|
| The outside world, though, appears to have embraced it.
| no_wizard wrote:
| My understanding is it largely exists in the confines of
| GCP as it targets infrastructure mainly
| stiltzkin wrote:
| Ruby is to Rails.
| Alifatisk wrote:
| What do they mean by "parallel browser engine"? That it runs
| on multiple cores?
|
| Also, what's the goal with Servo? I am all for new browser
| engines to compete against Chrome's monopoly!
| HellDunkel wrote:
| I think it was designed as the next-level browser engine for
| Firefox, making heavy use of GPUs (probably for mobile). Then
| things got stuck.
| amelius wrote:
| Cynical prediction: by the time this project is finished, the
| entire rendering pipeline of browsers will happen on the GPU.
| GuB-42 wrote:
| GPUs probably won't be a good fit for the browser, except for
| purely graphical tasks, like drawing images, for which they
| are already used.
|
| GPUs are good for massively parallel tasks, for example,
| shading every individual pixel on a 3D rendered scene, which is
| what they are designed for, bruteforcing hash functions, as it
| is done in cryptocurrency mining, or multiplying large
| matrices, as it is done in deep learning.
|
| But most of what a browser does is not massively parallel. JS
| is mostly single-threaded, and while layout, HTML parsing, and
| managing network connections may benefit from a bit of
| parallelism, we are not talking about thousands of
| simultaneous operations with no interdependence. For that,
| GPUs suck: they don't have the caching, branch prediction,
| synchronization, etc. abilities of a CPU, and attempting such
| work on a GPU will only bog it down while its overpowered math
| units sit idle.
| loa_in_ wrote:
| Some things that come to mind: layer compositing, layer
| transformations, animation (deferred from CPU)
| sebzim4500 wrote:
| GPUs are already doing all of those things in modern
| browsers.
| Dylan16807 wrote:
| Those sound like "purely graphical tasks" to me.
| GeekyBear wrote:
| Parts of Servo to enable the use of the GPU for compositing
| layers were merged into Firefox years ago.
|
| https://hacks.mozilla.org/2017/10/the-whole-web-at-maximum-f...
|
| It's a shame that Patrick Walton didn't get a chance to finish
| his work on the Pathfinder GPU renderer.
|
| >Pathfinder 3 is a fast, practical, GPU-based rasterizer for
| fonts and vector graphics using OpenGL 3.0+, OpenGL ES 3.0+,
| WebGL 2, and Metal.
|
| https://github.com/servo/pathfinder
| dralley wrote:
| I wonder if it would be easier to finish if based on a wgpu
| backend, rather than having 4 separate backends.
| mandarax8 wrote:
| Opengl3/es3/webgl2 might as well be the same backend
| Sesse__ wrote:
| The paint phase of rendering can easily happen on the GPU, and
| more and more stuff will move to the GPU as time goes by. The
| style and layout phases... unlikely.
|
| Think of a GPU this way: If you could reasonably launch a
| thousand threads on the CPU and get a benefit from it--i.e.,
| essentially, you have tens of thousands of largely independent
| work chunks (or more)--your problem might be well-suited to a
| GPU. A large web page contains a couple thousand DOM elements,
| and many of them depend on each other wrt. style, so it's not a
| good fit. (Stylo is parallelized, so it is capable of using a
| multicore CPU, but there's no way it can get a reasonable gain
| from 1000 cores.) Layout is also nontrivial to parallelize.
|
| However, paint is a different beast. In many cases, you can do
| per-pixel parallel work. So various forms of font work,
| line/shape rendering, compositing/blending, scrolling, video
| decoding... that's where the GPU is useful.
| psychphysic wrote:
| Is this now intended to be a full featured browser? That'd be
| great.
|
| Last I checked, this was going to be a sandbox for Rust ideas,
| with no intention of Gecko being dropped by Firefox.
|
| It would be cool if it could at least be a "secure" window with
| heavy restrictions on add-ons etc. Not full featured, maybe not
| even much hardware acceleration. Just a single window that is
| super hardened.
| maksimur wrote:
| I hope not, unless GPUs become cheaper.
| capableweb wrote:
| How much cheaper can they get? Most laptops have CPUs/boards
| with integrated GPUs, and desktop GPUs can be bought for very
| cheap, as long as you don't buy the latest greatest edition
| of everything.
| rollcat wrote:
| It's hard to lose on this one: even an underpowered,
| integrated GPU will probably do a better job than an average
| CPU. Desktops figured this out a while ago; both OS X and
| Windows have been rendering on the GPU for a very long time,
| not necessarily to enable any extra graphical effects, but
| just to make interaction faster, smoother, and more
| energy-efficient.
| the_third_wave wrote:
| Who knows? A rendering engine will still be needed though so
| another prediction could be that Servo supports such GPU-based
| rendering.
|
| Maybe a better prediction would be for the CPU/GPU/MPU/APU/xPU
| distinction to become less relevant since processors will gain
| those capabilities? A future where processors come in chiplet-
| based packages with a few generic cores and some more tailored
| to massively parallel matrix processing maybe?
| abrztam wrote:
| Yes, ads running on the GPU are the future, unblockable.
| sebzim4500 wrote:
| Why are ads on the GPU harder to block than ads on the CPU?
___________________________________________________________________
(page generated 2023-05-27 23:00 UTC)