For me a perfect web browser is something hackable, made of tiny replaceable blocks (buzzword: microservices) as small as realistically possible, not a single binary. Everything I've seen is basically a monolith. Even Surf and uzbl are, except that they move tab management and the UI out of the core.
Persistence (cookies), for example, would be handled by a completely separate program that just speaks a text-based protocol on stdin/stdout; all sorts of features (containers/multiple identities, self-destructing cookies, various tracking-protection mechanisms, etc.) would then be a matter of wrapping it.
All networking would be done by a separate process, so ad blocking (even inside WebSockets), experimental protocols (like IPFS), forcing all traffic through Tor, or just a "user agent switcher" would be possible without the core even being aware such things exist. Caching could be another program that sits in front of it.
And non-core stuff like history, bookmarks, password saving, or syncing feels like it should be separate programs that only interact with the core to obtain or inject the necessary information.
Of course, there needs to be a manager process that spawns the necessary services (including extensions sitting in the middle, so it's also an extension manager, not just a tab manager) and routes communication between them, so the user doesn't have to do any programming unless they actually want to.
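To make the idea concrete, here's a minimal sketch (purely hypothetical, in Python) of what such a standalone cookie service could look like: a line-based protocol on stdin/stdout, with SET/GET/CLEAR verbs invented for illustration. A "self-destructing cookies" wrapper would then just sit between the core and this process.

```python
# Hypothetical sketch of cookie persistence as a separate process
# speaking a line-based text protocol on stdin/stdout. The verbs
# (SET/GET/CLEAR) are invented for illustration, not an existing spec.
import sys

def handle(jar, line):
    """Process one protocol line against the cookie jar, return the reply."""
    op, _, rest = line.strip().partition(" ")
    if op == "SET":
        domain, _, value = rest.partition(" ")
        jar[domain] = value
        return "OK"
    if op == "GET":
        return jar.get(rest, "")
    if op == "CLEAR":          # a wrapper could build self-destructing
        jar.pop(rest, None)    # cookies by issuing CLEAR on a timer
        return "OK"
    return "ERR unknown command"

def main():
    jar = {}
    for line in sys.stdin:
        print(handle(jar, line), flush=True)

if __name__ == "__main__":
    main()
```

Containers/multiple identities would just be several instances of this process, each with its own jar.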
That's not a design, just a general idea^W^W some musing that mostly came out of frustration when I tried to hack on the Firefox code, trying to rip out their accounts+sync implementation and replace it with something sane and simple.
I've been dreaming of this kind of thing for a few years but haven't gotten anywhere - just covered some paper with silly diagrams and toyed with Servo a little, but nothing practical. I'm only writing this because I've never seen similar suggestions, and maybe it will resonate with someone.
Yes, I was thinking microservices too. And composability (the Unix philosophy) - any way to break up these monolithic beasts into more manageable pieces. I'd really love it if we could define some standard data structures and protocols.
I'm working on a threaded prototype of netrunner where messages are passed between components and could even be moved out to IPC or even the network.
That's also why I'm thinking of this project as more like Electron - a browser could be built on top of it.
We've had a couple of WebKit developers tell us how hard it was to hijack some seemingly simple behavior. Interesting to hear that Firefox is the same way.
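A rough sketch of that message-passing shape, assuming nothing about netrunner's actual internals: components talk only through queues of serialized (JSON) messages, so the transport could later be swapped for pipes or sockets without touching the components. The "fetch"/"document" message types are illustrative.

```python
# Minimal message-passing sketch: a networking component that answers
# "fetch" requests over a queue of JSON text messages. Because messages
# are plain serialized text, the queues could be replaced by IPC or
# network sockets later. Message names are made up for illustration.
import json
import queue
import threading

def network_service(inbox, outbox):
    # Stand-in for the networking component.
    while True:
        msg = json.loads(inbox.get())   # messages are plain JSON text,
        if msg["type"] == "quit":       # so they could cross a socket
            break
        if msg["type"] == "fetch":
            reply = {"type": "document", "url": msg["url"],
                     "body": "<html>stub for %s</html>" % msg["url"]}
            outbox.put(json.dumps(reply))

to_net, from_net = queue.Queue(), queue.Queue()
t = threading.Thread(target=network_service, args=(to_net, from_net))
t.start()
to_net.put(json.dumps({"type": "fetch", "url": "http://example.org/"}))
doc = json.loads(from_net.get())
to_net.put(json.dumps({"type": "quit"}))
t.join()
```

An adblocker or Tor-enforcement component would just be another process sitting between the core and this one, filtering the same messages.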
Chrome is already built this way - tons of independent services running in separate processes or threads with limited permissions. The problem is that all the systems are deeply interdependent, so debugging or modifying the system involves reading and understanding the whole codebase.
They're doing it from scratch... well, good luck with that.
Unless, of course, you just want to render private, custom-tailored stuff and don't care about the rest of the Web.
If they want to be adventurous, why not at least help a project like Servo?
Now, this would make a lot of sense if they're going to use it in a custom infrastructure - if you want to explore Tor or Bitcoin in a particular way, where you know sites will be made to render in this particular browser.
Otherwise, it's a multi-year project just to catch up to where the big guys were 8 years ago. It's great as a learning path, but if you're expecting this to be the next Firefox...
Our original lead developer has left to implement a browser using Servo.
My goals are a little different. I want this to be a browser by devs, for devs. And I'd rather learn how to make one by exploration instead of copying.
No expectations of this being a good browser; just hoping for the most dev-friendly one, and we'll see how far we can take it.
> And I'd rather learn how to make one by exploration instead of copying.
> No expectations of this being a good browser, just hoping for the most dev friendly and we'll see how far we can take it.
Nice. So your primary goals and expectations are well balanced.
So given that you guys are doing this, you should maybe consider setting some exploratory goals, taking different paths and decisions from the current browsers.
That may lead to some good innovations along the way.
Well, but even then... you will need to start focusing on a multi-threaded compositor to render the chrome, visual DOM nodes, images, video frames, animations, etc.
Of course you don't need to start with all of that from the beginning, but at least you need to get the architecture right.
Then, on top of the compositor, you will need a web engine to form the nodes and render them back through the compositor.
I guess if you just do an immediate-mode render engine with an OpenGL backend you can render something, but it will be pretty limited - mostly static content.
So even if it's an experimental thing like "I want to use Lua instead of JavaScript for scripting", or Lisp, as I've seen in the comments, you would need to focus on the compositor first and then create the web engine on top. But why start from scratch when you can reuse some Chrome parts (given that this is in C++)? Just take the compositor from Chrome in 'src/cc', reuse 'src/base', 'src/crypto', 'src/net', 'src/ipc', 'src/gpu', and some parts of 'src/ui' (if you want to), and you are good to go.
Then you can just focus on creating a new, innovative web engine.
For instance, I'm using the Chrome compositor to do multi-platform UI rendering with Swift in the UI API layer, and it's working pretty well.
Anyway, it's not wrong at all to start like this. It's cool - the hacker spirit, right? But you should know that unless you're expecting to give a lot of good years of your life to make it really good, you should have more humble expectations about what you will end up with.
(And you should consider yourself lucky if you end up with the equivalent of Netscape 2.0 after a year of full-time work.)
I actually have an offline repo with multithreading started, but it's on hold while we focus on other issues. I'd like to implement threading sooner rather than later, as it will be easier to do on a smaller project than a large one.
It's all about the architecture. We want to get that right first and are willing to throw away our current code to make sure we have it nailed. We're focusing on a flow-programming style for the renderer (only update what needs to change) as much as possible, so we yield to the OS unless there are pending events (using GLFW). No calling render() 60 times a second (unless something is actually pumping events that fast).
We're actually going to be converting HTML/CSS into JavaScript, and the renderer will be handling a dynamic DOM, so we're not falling into the trap of expecting things to be static. A couple of us are ex-game devs and have worked on HTML5 games.
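A toy version of that render-on-demand loop, with a plain queue standing in for GLFW's blocking glfwWaitEvents(); the point is that render() runs only when an event dirties the scene, never on a fixed 60 Hz timer. Event names are made up for illustration.

```python
# Render-on-demand sketch: block until an event arrives (yielding to the
# OS, like glfwWaitEvents), then render only because something changed.
import queue

def run(events, render, max_frames=100):
    frames = 0
    while frames < max_frames:
        ev = events.get()            # blocks: the process sleeps while idle
        if ev == "quit":
            break
        if ev in ("resize", "input", "dom-changed"):
            render(ev)               # re-render only what an event dirtied
            frames += 1
    return frames

rendered = []
events = queue.Queue()
for ev in ("input", "dom-changed", "quit"):
    events.put(ev)
frames = run(events, rendered.append)
```

Two events arrive, so render() runs exactly twice; with no events it would never run at all.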
> Well, but even then... you will need to start focusing on a multi-threaded compositor to render the chrome, visual DOM nodes, images, video frames, animations, etc.
Why?
If you simply don't implement video/audio/scripting, nor multi-threaded anything, you can still render a large percentage of pages on Tor hidden services. That's because the sites built to work with a privacy overlay are already themselves extremely limited in complexity. And the ones that aren't frankly aren't safe to view anyway, at least not without NoScript turned on (in which case you're essentially back to HTML 4.01).
Such a browser actually has a chance of working with a lower risk of exploits than the currently available browsers. (With the downside that an "off-brand" browser stands out in the crowd wrt fingerprinting, but that's true no matter what you implement.)
But if the goal is to render not just hidden service content, but arbitrary content including normal web pages fetched through a Tor exit node, a project like this written in a language like C++ will only ever be riskier and easier to exploit than current browsers. All you could ever ethically claim to potential users is that it's not Firefox nor Chrome (and even then that's certainly no reason to prefer it in a privacy overlay).
Of course you won't do all of those things from the start. But you write the bare minimum to render static surfaces on the screen.
It's not that big of a deal at the start: just expose a 'Layer' object that can represent a surface on the backend (e.g. OpenGL), and use a backend thread for the rendering part (rendering the current layer-tree state using display lists, for instance). Then the chrome engine layer and the web engine can just use that representation of a 2D surface to render.
But at least you get the architecture right from the start. Every project I know of ends up aiming for something more later, and if you don't get the architecture right from the start, then when you need to advance to goal X you will end up throwing away everything you've done and restarting from scratch.
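For what it's worth, here's a toy sketch of that 'Layer' idea (class and command names invented): each layer records a display list, and a flattening pass walks the layer tree into backend draw calls that a render thread could replay against OpenGL.

```python
# Toy 'Layer' sketch: layers record display lists (drawing commands),
# and a backend pass flattens the layer tree into draw calls. In a real
# engine the flat list would be replayed on a backend thread via OpenGL;
# here draw calls are just tuples so the idea stays self-contained.
class Layer:
    def __init__(self, name):
        self.name = name
        self.display_list = []   # recorded drawing commands
        self.children = []

    def draw_rect(self, x, y, w, h):
        self.display_list.append(("rect", x, y, w, h))

def flatten(layer, out):
    # Depth-first walk: parent's commands first, then children on top.
    for cmd in layer.display_list:
        out.append((layer.name,) + cmd)
    for child in layer.children:
        flatten(child, out)
    return out

root = Layer("chrome")
page = Layer("web-content")
root.children.append(page)
root.draw_rect(0, 0, 800, 24)      # browser chrome (tab bar)
page.draw_rect(0, 24, 800, 576)    # page background
commands = flatten(root, [])
```

Both the chrome engine and the web engine only ever see Layer; the backend (OpenGL or anything else) stays swappable behind the flat command list.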
I think the title is a bit misleading. Nowhere on the linked page does it actually say "anonymous", other than that several people on 4chan's /g/ board who don't use names/tripcodes/whatever the term is - "anons" - asked about it.
That being said, a full browser from scratch is a pretty cool idea. I don't think it'll get to the point where I'd be able to fully replace my "normal" browser with it, but it's really cool nonetheless.
Title is definitely misleading. Nor do I think the project will stay under the netrunner name, as there's a distro with the same name.
The goal now isn't a browser but a framework that is easily modified, well documented, and can be embedded in other projects.
We realize how much work a real browser is, but if you look at the source of WebKit, Blink, or Servo, it's a huge mess. We're trying to be low-dependency with multiple interfaces - perfect for scripting.
Our web page targets are simple: just 4chan/g and Stack Overflow. But we also have plans for this framework to be used for document viewing and editing.
I'm thinking of the project as more of an Electron replacement. But we'll see where the future takes us.
Tabbed browsing by the end of this month is on track.
I am not sure why you feel Servo is a huge mess. I have worked on all of Gecko, WebKit, Blink, Servo, and Servo is distinctly cleaner. (I think it's mostly because it's newer.) I guess it is still huge.
We've updated the title from “NetRunner: a web browser for anonymous” to one derived from a representative phrase from the article, but we can change it again if someone suggests a better one.
For the record, posts saying "title is bad" are very confusing if they don't include a copy of the title they're criticising. Later, when the title is changed to something better, they make no sense at all, and people waste time trying to figure out what the post is talking about.
Currently, the word "anonymous" is not part of the title. I guess it was at some point. I don't know.
I recently started using Qutebrowser[1] on OpenBSD, and it's really, really good. There's a learning curve, but it's well documented and help is easily available. If you use vi, you'll feel right at home.
Yes, we've been in contact with the dev of that project. Some of 4chan's complaints with it are as follows:
- a ton of dependencies
- randomly hangs
- can't handle suspending
- cursor blinks when not in insert mode
- doesn't work
- Python not C
- QT
- Lacked security features
I'm not sure if all of these are true, but this was their list of complaints, and you'd have to ask them for more detail. A lot of them sound like they're fixable or just personal preference.
4chan has been discussing options since mid-June and probably has around 50 threads revolving around the Netrunner project. And the current devs are not the same people who started the initiative, so the project has evolved quite a bit.
NetSurf is another project that comes up quite a bit: http://www.netsurf-browser.org/
As well as various forks of the major browsers such as palemoon, icecat, unChromium, ungoogled-chromium, waterfox, opera 12.x, vivaldi, and otter.
That'd be me - most of those aren't really true, or aren't qutebrowser's fault ;-)
> a ton of dependencies
Only Qt and a handful of small Python libraries. Granted, Qt is big.
> randomly hangs
> can't handle suspending
Detailed bug reports or it didn't happen - but that's probably due to people using it with an old QtWebKit version. The latest release warns about that, and v1.0 (coming later this year) will drop support for it entirely.
> cursor blinks when not in insert mode
True - there's an issue open about it, but nobody has found a way to make it not blink without other side effects so far.
> doesn't work
> Python not C
> QT
¯\_(ツ)_/¯
> Lacked security features
That's mostly referring to missing NoScript/uMatrix-like features, which will come with per-domain settings later this year.
I've used Vimium for some months myself and wasn't really happy with it. The reasons mainly boil down to how limited Vimium is in what it can do. For example:
- It can't change the user interface at all - qutebrowser has a much more minimal UI.
- It can't spawn external processes. In qutebrowser, you can simply hit ctrl-e while editing some text input, to edit it in e.g. Vim. Or you can use ":bind ,v spawn mpv {url}" to add a keybinding which spawns mpv with the current page, to watch YouTube videos in a real video player.
- As soon as you are on some special page (like the Chrome extension store, or the "new tab" page), it stops working, because it can't intercept keypresses there.
- In general, qutebrowser is much more configurable and extensible. You can easily integrate it with shell or Python scripts via userscripts, and soon there'll be a Python plugin API as well.
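As I understand the userscript interface (treat the details as assumptions): qutebrowser exports page state to userscripts in QUTE_* environment variables, and scripts send commands back by writing to the FIFO named in QUTE_FIFO. A sketch of the "open in mpv" idea, factored so the logic can run without a browser:

```python
# Hypothetical sketch of a qutebrowser userscript in Python: read the
# current URL from the environment, hand a command back to the browser.
# The QUTE_URL/QUTE_FIFO names reflect my understanding of the
# interface; check the userscript docs before relying on them.
import os  # needed for the real os.environ / FIFO version below

def open_in_mpv(env, write_command):
    """Ask the browser to spawn mpv on the current page's URL."""
    url = env.get("QUTE_URL", "")
    if url:
        write_command("spawn mpv {0}".format(url))
        return True
    return False

# In a real userscript, write_command would append to the browser FIFO:
# def write_command(cmd):
#     with open(os.environ["QUTE_FIFO"], "w") as fifo:
#         fifo.write(cmd + "\n")
sent = []
open_in_mpv({"QUTE_URL": "https://example.org/video"}, sent.append)
```

The same pattern works for shell scripts: anything that can read an environment variable and write a line to a FIFO can drive the browser.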
Probably referring to the move to WebExtensions. Borrowing from TylerDMozilla [1]:
"The current way add-ons are developed gives add-ons complete control over almost anything in the browser. This makes for very powerful add-ons, but add-ons can also do really bad things (accidentally or on purpose) and we can't make major changes to the browser without breaking all sorts of add-ons (which makes people sad).
Web Extensions is a sort of building block set. It means add-ons can't touch anything in the browser, but can only play with the blocks we provide. We can make all sorts of blocks of different shapes, but it will never be as powerful as the old system of add-ons (where developers could play with anything in the house). However, this lets us do major changes to Firefox without having to worry about breaking add-ons (since we know what the blocks are), and keeps add-ons from doing bad things (accidentally or on purpose)"
I would just like to point out that I have used numerous privacy extensions for Firefox ever since I started using it. I've happily found WebExtensions replacements for all of them except HTTPS Everywhere, which AFAIK is developing a WE replacement.
For me, the WebExtensions transition isn't all doom and gloom. But I can't assume everyone uses the same add-ons as me. The transition to WebExtensions is nearing, and a lot of extensions people use just don't have a viable replacement.
Have you used Self-Destructing Cookies, and if so, have you by chance found any replacement?
I only know about Cookie AutoDelete, and it's great that it exists, but sadly it's not yet mature enough - mostly because the WebExtensions APIs are lacking. For example, there is no localStorage support, and there are bugs like https://github.com/mrdokenny/Cookie-AutoDelete/issues/83
The developer [odilitime] is posting in this thread, but all their posts have been killed by downvoting. Turn on the 'showdead' option and you'll see that they are all reasonable.
HN is becoming an echo chamber where posting while green will get you deleted for no real reason, as I've said in the past.
> ...but all their posts have been killed by down voting.
That's not what happened here. Those comments were killed by anti-abuse software that's particularly sensitive to new accounts. When false positives like that happen, we rely on the community to vouch for them (you have to click on the timestamp of the comment to go to the comment page and then click 'vouch') or email us at hn@ycombinator.com so we can be sure to see them.
We've unkilled these and marked the account so this shouldn't happen again.
Since it took me a minute to figure it out (being new), you can turn on that option by clicking on your username on the top right. (I expected it to be a thread option)
I'd love to compare diagrams.
Thanks for the kind words.
That's a cool idea. Then it's just a matter of exposing the C++ composition/rendering engine to JavaScript.
But of course, this will require a pretty hardcore JavaScript JIT, like the current V8.
https://limpet.net/mbrubeck/2014/08/08/toy-layout-engine-1.h...
It walks through the basics of writing a layout engine, which is extended over a series of posts to become quite flexible and interesting.
I'd love to see that. I think an Electron replacement would be really good.
[1] - https://qutebrowser.org/
I must have missed something. What recent changes?
[1] https://www.reddit.com/r/firefox/comments/6cw7ig/why_is_fire...
So: safer for the end user, better for the Mozilla developers, but not as powerful for third-party developers.
https://en.wikipedia.org/wiki/Comparison_of_web_browsers
irc.rizon.net #/g/netrunner
Going to take a bit to get my head around it.
https://postimg.org/image/q2qu8uc1v/
https://postimg.org/image/4w0ln83yb/