Oh this is nice to hear. It's always pleasant to read updates about Servo. I didn't know they started accepting donations on Open Collective and GitHub sponsors last year https://servo.org/blog/2024/03/12/sponsoring-servo/. I'm happy to contribute something.
FYI: I think their website indicates they pay the lowest fees on GitHub, so if anyone is still deciding between the two, you may want to set up your donations there.
> I think their website indicates they pay the lowest fees on GitHub
The difference (https://servo.org/sponsorship/#donation-fees) between the donation going via GitHub/Microsoft and Open Collective (independent) is so small (96% vs. ~91%) that I'd rather not centralize FOSS funding with someone who has kind of a shitty track record with it, like Microsoft.
It made more sense in the beginning of GitHub Sponsors, when Microsoft was matching donations 2x or whatever it was. But now? I don't feel like it makes much sense anymore.
Open Collective is a fully public organization that lives and breathes FOSS.
Igalia is the real deal. Many companies that want bugs fixed or features added to web browsers hire Igalia to make those changes. They also maintain WebKit on Linux (GTK and WPE): https://planet.igalia.com/webkit/
Yeah, I've been seeing their site a lot while setting up Cog and Weston for an embedded kiosk thing. Also met a bunch of them at the Open Source Summit in Vienna.
Every now and then you run into these small-ish expert consultancies that actually are the force behind a lot of open source.
That's interesting. One wonders what their future looks like after Google divests Chrome. Good to see that the knowledge base isn't entirely confined within Google.
Igalia is self-directed and one of the few organizations I can see taking on technical leadership of the entire Chromium project if/when Google divests Chrome.
I think that's intended. This indicates that there's a possibility it's default dead: https://paulgraham.com/aord.html
> Servo is a huge project. To keep it alive and making progress, we need continuous funding on a bigger scale than crowdfunding can generally accomplish. If you’re interested in contributing to the project or sponsoring the development of specific functionality, please contact us at [email protected] or igalia.com/contact.
> Let’s hope we can walk this path together and keep working on Servo for many years ahead.
I assume that they're hoping that the EU or an EU member-state steps up; or failing that, maybe a (probably-US) nonprofit or billionaire donor, perhaps a Laurene Powell Jobs or MacKenzie Scott type. To be clear, something like this very probably should happen. I'm heading to social media to shout into the void about this: dear reader, you should probably do this too, and use any other means you might have to steer the attention of decision-makers towards this.
That said, in the longer term the solution to the WWW's Too Big To Fork problem surely has to involve getting much more of the "specification" expressed precisely in declarative specification languages, so as to greatly reduce the handwork involved in generating a half-decent implementation.
A lot of private funding at these consultancies actually comes from pet features. Some company says "hmm, we sure do rely on XYZ feature a lot, would be nice if it were faster", they throw some money at a consultancy like Igalia, and then it becomes faster for everyone. No need for a big pot all at once, though I'm sure that'd be really nice.
I don't think a single big donation is a good idea. We're so used to seeing extreme wealth that we don't even question it.
Once a big donation is given, you get to wonder what sort of influence that person (willingly or not) has had on the project. A much better model is a large number of small donations; the incentive then becomes to serve the maximum number of those people.
I too would prefer that the funding come from a relatively hands-off source, like some EU pot, if possible. But I think that nearly any (reasonably likely) funding source would be preferable to letting Servo development fail.
Except that paulg's essay was about startups, not about an OSS project.
Servo has no "customers" as such. It has potential future project users and there may be a support/development economic return for those users to fund further work.
It's very similar to Rust the language. Rust itself is not a startup or a company product.
Correct. There are some shared components (notably Stylo, the style system), but Webrender is not one of them. Webrender is still maintained primarily by Mozilla as part of Firefox. If they have any plans to move to Vulkan/Metal/DX10/wgpu then I haven't heard of them.
It might be possible for Servo to go down the same route as Blitz and have pluggable rendering backends. If so, the wgpu-based rendering library we are using (Vello [0], which is an exciting project in its own right) could be an option. Servo is actively looking at potentially using this library to implement Canvas2D.
[0]: https://github.com/linebender/vello
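To illustrate what "pluggable rendering backends" could mean in practice, a hypothetical sketch; none of these names come from Servo's or Blitz's actual code. The engine draws only through a trait, and Vello, WebRender, or anything else would implement it:

    // Hypothetical backend abstraction; real engines expose far richer
    // display-list APIs, but the plug-in shape is the same.
    trait RenderBackend {
        fn fill_rect(&mut self, x: f32, y: f32, w: f32, h: f32);
        fn present(&mut self);
    }

    // A stand-in implementation that just logs draw calls.
    struct LoggingBackend;

    impl RenderBackend for LoggingBackend {
        fn fill_rect(&mut self, x: f32, y: f32, w: f32, h: f32) {
            println!("fill_rect({x}, {y}, {w}, {h})");
        }
        fn present(&mut self) {
            println!("present frame");
        }
    }

    // The engine only ever sees the trait, so backends can be swapped
    // without touching layout or script.
    fn draw_frame(backend: &mut dyn RenderBackend) {
        backend.fill_rect(0.0, 0.0, 100.0, 50.0);
        backend.present();
    }

    fn main() {
        let mut backend = LoggingBackend;
        draw_frame(&mut backend);
    }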
All this discussion (hate) on the effort to "rewrite in Rust"...
At the same time these projects are soooo promising (to me -- it may be purely subjective).
By these projects I mean:
Servo and Verso
Redox OS
System76's COSMIC Desktop's EPOCH
RipGrep
Deno
Zig
tree-sitter
And lots of web dev libs and frameworks: Actix, Leptos, Dioxus...
Currently a web dev stack can run on Redox OS and use significantly fewer resources than Alpine! (and this stack has not even had the years of tuning Alpine had)
Rewriting things in Rust is a reasonable thing to do. I think the hate is for people who criticize existing software for being written in C on the grounds that hypothetically someone could rewrite them in Rust.
"I rewrote SQLite in Rust" would be praiseworthy (assuming it's true). "Why don't you rewrite SQLite in Rust?" is trolling.
I took a look at COSMIC and it really looks nice. I am not interested in it because it is written in Rust; it simply looks nice, and the window management also looks promising. I hope to run it on my main machine soon.
Honestly, either should be more than possible to do, although I'm not sure how beneficial it would be. It would certainly be very funny if the Zig compiler were implemented in Rust and, simultaneously, the Rust compiler were written in Zig.
The Zig compiler does lots of things for speed that would push it well into unsafe Rust, or unchecked Rust (like using u32 index tags into arrays instead of pointers).
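A rough sketch of the index-into-a-Vec flavor of that pattern in safe Rust (types and fields made up for illustration): nodes refer to each other with u32 indices rather than pointers. Indexing stays bounds-checked at runtime, so there's no undefined behavior, but the compiler can no longer prove an index is valid or still meaningful the way it can for a borrow:

    struct Node {
        value: i32,
        // Index of the next node in the arena; u32::MAX stands in for "none".
        next: u32,
    }

    struct Arena {
        nodes: Vec<Node>,
    }

    impl Arena {
        // Appends a node and returns its index, the "pointer" in this scheme.
        fn push(&mut self, value: i32, next: u32) -> u32 {
            self.nodes.push(Node { value, next });
            (self.nodes.len() - 1) as u32
        }

        // A stale or wrong index panics (bounds check) instead of being UB,
        // but nothing stops you from constructing one; "unchecked" in spirit
        // even though it is technically safe code.
        fn get(&self, index: u32) -> &Node {
            &self.nodes[index as usize]
        }
    }

    fn main() {
        let mut arena = Arena { nodes: Vec::new() };
        let first = arena.push(1, u32::MAX);
        let second = arena.push(2, first);
        assert_eq!(arena.get(arena.get(second).next).value, 1);
    }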
Of all the projects for Mozilla, the supposed champions of the web, to abandon, it still blows my mind that Servo is the one whose entire team they chose to lay off.
HN: "Mozilla has too many side projects that don't make the browser better"
Also HN: "Mozilla should spend more than a decade and tens of millions of dollars on a brand new browser engine that has no hope of replacing Gecko before it reaches 100% compatibility with a spec thousands (tens of thousands?) of lines long, not to mention the kind of "quirks" you see with websites in the wild, while they already lag behind Google with the browser engine they already have."
People like cool R&D projects, and that's understandable - I like Servo too. But the fact that it was really cool doesn't compensate for the fact that it was not going to be production-ready any time soon and in that light it's understandable why it was cancelled. While some parts of Servo ended up being so successful that they were merged into Firefox, a lot of what remained only in Servo (and not in Firefox) was nowhere close.
The layout component was by far the least mature part of Servo at the time (in contrast to Stylo and WebRender, I mean), and in fact it was going through the early stages of a brand-new rewrite when the project was cancelled, partly because the experimental architecture ended up not being very suitable: https://servo.org/blog/2023/04/13/layout-2013-vs-2020/
> that has no hope of replacing Gecko before it reaches 100% compatibility with a spec thousands (tens of thousands?) of lines long
When Servo was still managed by Mozilla, they were able to merge some components incrementally into Firefox. Most famously, Stylo and WebRender were first developed in Servo. They could have kept Servo for experimentation and merged parts incrementally.
It may also have enabled better embedding support, which is a weak point of Firefox compared to Chrome, and a long-term way to remain relevant.
I covered that. Sure, Stylo and WebRender were successful enough that they made it into Firefox, but the Layout component was very much not. Servo was in the middle of a clean-slate rewrite of the layout component because the initial architecture chosen in 2013 wasn't very good.
The CSS engine and rendering engine are a lot easier to swap out than the remaining parts.
Again, I get why people like Servo, but "in 10 years, maybe we'll be able to take on Electron" isn't that great of a value proposition for a huge R&D project by a company already struggling to remain relevant with their core projects.
> "in 10 years, maybe we'll be able to take on Electron" isn't that great of a value proposition
Perhaps not, but "in 10 years, we'll have a browser that's significantly faster and safer than the competition" is how you plan to still be relevant 10 years from now.
The browser engine is not what makes Firefox "relevant" or not. Their competitors are Apple, Google and Microsoft. The marketing budget for Chrome is larger than Mozilla's entire budget. "Google" is synonymous with the entire internet for a large fraction of the non-technical population. Every device you could buy on the market, whether a PC, a tablet or a phone, has one of their competitors' browsers already pre-installed.
Their primary leverage is unique features and functional adblockers, neither of which is impacted by the layout engine.
And again, you're taking away resources from something that is already behind right now. The canonical example of massive long-term rewrites being a bad idea for the business is literally the precursor to Firefox. Gecko can be refactored in-place, including into Rust if they decided to do so.
> Their primary leverage is unique features and functional adblockers, neither of which is impacted by the layout engine.
Yes, unique features like being written in a memory safe language and depending on memory safe implementations of image and video decode libraries are exactly what I care about in an all-knowing sandbox which touches network services and runs untrusted code on my computer.
> And again, you're taking away resources from something that is already behind right now.
Disagree. You're talking about every Mozilla project that's not Servo. Firefox/Servo development is Mozilla's core competency. One which they've abandoned.
> depending on memory safe implementations of image and video decode libraries are exactly what I care about in an all-knowing sandbox which touches network services and runs untrusted code on my computer.
What does that have to do with Servo? Firefox has already been doing those things and continues to do them [0]; they don't need to do them in Servo first.
We are specifically talking about the utility of rewriting a layout engine from scratch, rather than putting more resources into evolving Gecko - including rewriting small parts of Gecko in Rust incrementally.
> Disagree. You're talking about every Mozilla project that's not Servo. Firefox/Servo development is Mozilla's core competency. One which they've abandoned.
They obviously haven't abandoned it. It's not like they cancelled Gecko development too and are rebasing on top of Blink. Again, this is all just a philosophical debate over whether rewrites or refactors are more effective when it comes to the most core component of the browser.
[0] https://github.com/mozilla/standards-positions/pull/1064
Do you see those red and orange and green pie slices? 40% of the code. There be memory errors. Approximately 70% of all errors in that code will be memory-safety related and exploitable.
Fixing it looks like developing Servo.
Don't want to take my word for it? How about the US Department of Defense: https://media.defense.gov/2022/Nov/10/2003112742/-1/-1/0/CSI...
Mozilla continues to add new Rust to Firefox, despite discontinuing the Servo project. A big parallel rewrite is not the only possible approach to writing Rust. "Fixing it" does not have to look like Servo. In fact doing more incremental rewrites will improve the situation shown in that chart much faster than waiting 10 years for parity before doing the replacement.
I'm not responding further until you actually read and understand what I'm saying instead of flailing at a strawman.
> A big parallel rewrite is not the only possible approach to writing Rust.
A rewrite is the only way to convert the codebase to Rust or any other memory safe language. Whether that happens in parallel, piecemeal, or both at the same time is down to how well you use a version control system and structure your code, as has already been shown by sharing Servo code with Firefox.
A full rewrite is particularly useful with Rust, as the language wants you to structure your code differently than most C/C++ is structured. Doesn't make sense not to have one going if that's the plan. If you're going to rewrite the whole thing anyway, might as well do it in an idiomatic way.
Google has demonstrated that writing new code in a memory safe language still significantly improves the safety story of a codebase, even while keeping around the old code. Full scale rewrites are not the only option.
Yes, every line of C/C++ you can replace with a memory safe language in a critical codebase like a browser improves its safety story. Which is exactly the reason replacing all of it is so attractive.
But just to offer another point, I also still run into memory leaks and other performance issues in long-lived Firefox processes which, based on my experience with Rust, would be unlikely to be a problem in a functional Servo. It'd be nice to have a browser I don't have to occasionally kill and restart just to keep YouTube working.
Are you suggesting that memory leaks don't happen in Rust? Not only has that proven not to be true, but the language for some reason seems to define memory leaks as safe behavior.
> based on my experience with Rust
This suggests that you haven't encountered reference cycles at your level of experience: https://doc.rust-lang.org/book/ch15-06-reference-cycles.html...
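For anyone who hasn't hit this: a minimal sketch of such a cycle, closely following the pattern in the linked chapter (the Node type here is hypothetical). Two Rc-managed values point at each other, so neither refcount ever reaches zero and both allocations leak, in entirely safe Rust:

    use std::cell::RefCell;
    use std::rc::Rc;

    struct Node {
        // RefCell gives interior mutability so the link can be set after
        // both nodes exist; Option<Rc<Node>> is the actual cycle edge.
        other: RefCell<Option<Rc<Node>>>,
    }

    fn main() {
        let a = Rc::new(Node { other: RefCell::new(None) });
        let b = Rc::new(Node { other: RefCell::new(None) });

        // Create the cycle: a -> b and b -> a.
        *a.other.borrow_mut() = Some(Rc::clone(&b));
        *b.other.borrow_mut() = Some(Rc::clone(&a));

        // Each node now has two strong references. Dropping `a` and `b` at
        // the end of main only brings the counts down to 1, so neither
        // destructor runs and the heap memory is never reclaimed.
        assert_eq!(Rc::strong_count(&a), 2);
        assert_eq!(Rc::strong_count(&b), 2);
    }

(The usual fix is to make one direction of the link a std::rc::Weak, which does not keep the allocation alive.)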
> Are you suggesting that memory leaks don't happen in Rust?
Their wording was "would be unlikely", rather than "don't happen". The affine(ish) type system, along with lifetimes, makes it so most objects have a single owner, and that owner always knows it is safe to deallocate.
> for some reason seems to define memory leaks as safe behavior
The reason is that Rust aims to prevent undefined behavior, and that is the only thing it defines as unsafe behavior.
Memory leaks cannot cause a program to, for example, start serving credit cards or personal information to an attacker. Their behavior is well defined (maybe over-complicated thanks to Linux's overcommit, but still well defined).
Rust does not protect against DoS attacks in any way. In fact, it seems to enjoy DoSing itself quite a lot, given how many things panic in the standard library.
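To make "things panic in the standard library" concrete, a small sketch; note that a panic is still defined behavior (the thread unwinds or the process aborts cleanly), which is why it lands in DoS territory rather than exploit territory:

    fn main() {
        let v = vec![1, 2, 3];

        // `v[10]` would panic with "index out of bounds": well defined, but
        // it takes down the thread, which in a server is effectively a DoS.
        // The non-panicking counterpart returns an Option instead:
        match v.get(10) {
            Some(x) => println!("got {x}"),
            None => println!("index 10 is out of bounds, handled gracefully"),
        }

        // Same story for arithmetic: integer division by zero panics, while
        // checked_div returns None and lets the caller decide.
        assert_eq!(10_i32.checked_div(0), None);
    }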
whytevuhuni has done a wonderful job of saying the things I meant, clearer than I am able to articulate them. But I just wanted to point out that this is the first sentence of the documentation you linked:
"Rust’s memory safety guarantees make it difficult, but not impossible, to accidentally create memory that is never cleaned up (known as a memory leak)."
My sentiment exactly. Rust makes it difficult to accidentally create memory leaks. If you try hard to do it, it's definitely possible. But it's tremendously more difficult to accomplish than in C/C++ where it accidentally happens all the time.
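And for "if you try hard to do it": you don't even need a reference cycle; the standard library ships safe functions whose explicit job is to leak (a tiny sketch):

    fn main() {
        // Box::leak hands back a &'static mut and promises never to free
        // the allocation. Safe, well defined, and leaked on purpose; this
        // is sometimes used for process-lifetime config data.
        let config: &'static mut String =
            Box::leak(Box::new(String::from("lives until process exit")));
        println!("{config}");

        // std::mem::forget discards a value without running its destructor,
        // so any heap memory it owns is likewise never reclaimed.
        std::mem::forget(String::from("also leaked"));
    }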
That's an interesting take about a language that puts variables on the stack by default. The less you put on the heap, the less fragmented it gets. Heap fragmentation also does not account for the ever-growing memory footprint of a running instance.
C requires malloc (and the heap) for anything that lives beyond the lifetime of the enclosing function. C++ adds smart pointers and copy/move semantics, but the default behavior is still like C, and defaults matter.
It's the other way around; Rust is really good at tracking the lifetime of objects, and so Rust code is a lot more reckless about passing around pointers to stack-allocated objects through very long chains of functions, iterators, closures, etc., because it becomes obvious when a mistake was made (it becomes a compile error).
This makes it so that things that appear dangerous in C++ are safe in Rust, so for example instead of defensively allocating std::string to store strings (because who knows what might happen with the original string), Rust can just keep using the equivalent of std::string_view until it becomes obvious that it's no longer possible.
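A minimal sketch of that last point (function and variable names are made up for illustration): the callee borrows a &str, Rust's rough analogue of std::string_view, and the borrow checker guarantees the backing String outlives every use, so no defensive copy is ever needed:

    // Returns a slice borrowed from the input; no allocation, no copy.
    fn first_word(s: &str) -> &str {
        s.split_whitespace().next().unwrap_or("")
    }

    fn main() {
        let owned = String::from("hello borrowed world");
        let word = first_word(&owned);
        println!("{word}");
        // Any use of `word` after `owned` is dropped or mutated would be
        // rejected at compile time, which is exactly the mistake that makes
        // C++ code copy into a fresh std::string "just in case".
    }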
This (avoiding needless copies due to uncertainty of what the callee might do, e.g. caching a reference) makes sense but is not what the grandparent was suggesting.
It's exactly what I was talking about. Rust enables me to avoid using the heap when I would be forced to in C/C++. And thanks to the borrow checker ensuring safety in doing so, this extends to libraries and dependent code in ways not easily achievable in C/C++. The net effect is a profound reduction in heap usage by comparison.
I’m pretty sure if Firefox started beating Chrome in speed benchmarks (because of a newer, more modern engine) they would be able to claw back some of their lost market share. Even normal people care about speed.
That would be hard to do with Google intentionally sandbagging things like YouTube (I'm thinking about them using a nonstandard version of web components, plus a super slow shim for Firefox, instead of using the standardized version that Chrome also supported).
Does anybody argue that Google is negligent for not doing a complete rewrite of Blink, rather than doing the same incremental software development as everyone else? Did they suffer from their choice to use WebKit in the very beginning rather than do their own thing?
Every time Google kills a project they are bashed for it; there is (was?) even a website dedicated to projects killed by Google. And anyway, the core of Google isn't Chrome, it's Google Search.
I agree. Pretty much the main distinguishing feature of Firefox is that it doesn't use WebKit/Blink. Crazy of them to discontinue working on their own engine's future, especially when it was already yielding results.
I'm trying Firefox on Android at the moment and it's noticeably less snappy than Chrome. I wonder if Servo would have changed that.
I am always in bizarro world when I read comments like this. I swear I perceive Firefox for Android to feel snappier and smoother than Chrome.
To be clear, I am not trying to claim you are wrong! It is the common wisdom that Chrome is faster on Android. But I swear, scrolling and page loading just feel faster on Firefox. My only guess is the ad blocker, but I think Firefox is faster than Brave, so who knows.
I wonder if others have a similar experience and can shed some light on the situation?
I just double checked by closing all tabs, killing the browser and then loading Hacker News. It's about 0.5s on Chrome and 1s on Firefox (roughly). That's a big difference.
Firefox is probably faster for ad-heavy sites, but it definitely isn't for sites without obtrusive ads.
Those are the kinds of results I would expect! Plus V8 (if that is still the JS engine in Chrome) has always been faster. And Android is probably the priority target for Chrome at this point.
But on the occasional times I launch it (Chrome), it just feels like it bogs down more often doing basic things. Someone once suggested that, paradoxically, it is slower because I don't use it very often: something to do with ART and how AOT compilation works on Android.
Why? They already have their own browser engine; what would they gain by creating another one? It's a browser company, not a Rust promotion company, and from this point of view the decision was completely logical.
Is there anything wrong with Gecko architecture? So wrong that it's a major obstacle and cannot be changed? I don't know anything about its internals or browser engines in general, so I can't really comment on that, but what I do know from practice is that a complete rewrite is a very expensive and a very risky project, that will fail more often than not. There should be very serious arguments behind it, something a lot more serious than the age of the codebase or a new and shiny programming language.
It is a very big project and has big potential for the classic double-free, use-after-free, null-dereference, and off-by-one index errors that Rust was designed to get rid of in the first place. I believe they had such a bug some months ago and that it was quite serious.
It's a lot easier to swap out the renderer or the CSS engine for a new one than it is the whole core of the browser engine.
Mozilla decided that replacing Gecko as-is was not reasonably likely to actually happen, and that further efforts towards Servo would be better made by continuing to evolve Gecko.
Still, I think Rust was designed for exactly the style and scale of application that a web browser is: foundational but not kernel-level, highly complex, with a wide feature set, where performance is important (but not the most important thing) and high reliability, maintainability, and quality are expected.
Building these kinds of apps was commonplace in the 90s/early 2000s: photo editing apps, word processors, IDEs, 3D modeling software etc. Maybe RDBMS count as well.
In practice Rust is mostly used by web people to gain clout - rewriting microservices, which are usually <10k, but very rarely above 100k LOC, and were originally written in a very slow language, such as Python or Ruby.
Had these projects started out in an uncool, but performant language, like Java, there'd have been very little reasonable justification for these Rust rewrites.
The main advantage is having a reliable type system and a codebase you can reason about and refactor fearlessly.
On the performance side, raw processing power is not that important as you are waiting for I/O all the time: the main difference is not having GC and GC spikes (if you are working at significant scale) and lower memory usage all around.
Honestly though: why? Large chunks of the most important Servo work are in Firefox now. Nobody else maintains even one web engine. What is the importance for the open web of Mozilla developing two?
Sponsoring them was a no-brainer for me :)
Servo would only see 85.6% of my 5 USD/mo donation as I'm from Canada. If I used PayPal, that number would go down to 81.2%.
I do agree that I'd prefer Open Collective, fees being equal/comparable.
https://chimera-linux.org/
I've been looking for a while for a Linux distro that's easy to build from scratch and customizable.
They seem awesome.
The technical talent at Igalia runs deep.
They are self-directed contractors.
They are also responsible for huge portions of Chromium and fundamental/base open-source libraries.
If you can think of an open source library, it's highly likely Igalia has funded some development or bug fixes for it.
https://github.com/LadybirdBrowser/ladybird
The economics are completely different.
Should probably have a "(2024)" appended to the title.
I hope Servo will eventually replace Chromium in QtWebEngine and other similar cases.
Things You Should Never Do, Part I [0]
[0]: https://www.joelonsoftware.com/2000/04/06/things-you-should-...
[1]: https://en.wikipedia.org/wiki/Netscape_Navigator
Certainly not on performance. On safety you have a chance because bugs happen.
https://www.youtube.com/watch?v=BTURkjYJ_uk
They should just keep launching bookmarking and VPN services that might make money RIGHT NOW.
Meanwhile, daily-driving Gecko becomes a worse experience by the hour.
Complete hogwash.