It's the closest thing to a Unix successor we ever got, taking the "everything is a file" philosophy to another level and letting you easily share those files over the network to build distributed systems. Accessing remote resources is easy and robust on Plan9; on other systems we need to install specialized software, with poor interoperability, for each individual use case.
Plan9 also had some innovative UI features, such as mouse chording to edit text, nested window managers, the Plumber to run user-configurable commands on known text patterns system-wide, etc.
Its distributed nature should have meant it's perfect for today's world with mobile, desktop, cloud, and IoT devices all connected to each other. Instead, we're stuck with operating systems that were never designed for that.
There are still active forks of Plan9 such as 9front, but the original from Bell Labs is dead. The reasons it died are likely:
- Legal challenges (the Plan9 license, pointless lawsuits, etc.) meant it wasn't adopted by major players in the industry.
- Plan9 was a distributed OS at a time when owning a local computer was becoming popular and affordable, while using a terminal to access a centrally managed computer fell out of fashion (though the latter sort of came back, in a worse form, with cloud computing).
- Bad marketing and positioning itself as merely a research OS meant they couldn't capitalize on the .com boom.
- AT&T lost its near-endless source of telephone revenue. Bell Labs was sold multiple times in the years that followed, and a lot of the Unix/Plan9 people went to other companies like Google.
- Photon, the graphical interface for QNX. Oriented more towards real time (widgets included gauges) but good enough to support two different web browsers. No delays. This was a real time operating system.
- MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. It probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be buried to justify the bailout of NeXT by Apple to get Steve Jobs.
- Transaction processing operating systems. The first one was IBM's Customer Information Control System. A transaction processor is a kind of OS where everything is like a CGI program - load program, do something, exit program. Unix and Linux are, underneath, terminal oriented time sharing systems.
- IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. Channels could run simple channel programs, and managed device access to memory. IBM tried to introduce that with the PS2, but they made it proprietary and that failed in the marketplace. Today, everything has something like channels, but they're not a unified interface concept that simplifies the OS.
- CPUs that really hypervise properly. That is, virtual execution environments look just like real ones.
IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM. Storing into device registers to make things happen is not. x86 has added several layers below the "real machine" layer, and they're all hacks.
- The Motorola 680x0 series. Should have been the foundation of the microcomputer era, but it took way too long to get the MMU out the door. The original 68000 came out in 1979, but then Motorola fell behind.
- Modula. Modula-2 and Modula-3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.
- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman.
Would it kill people to have to close their tags properly?
- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.
>- XHTML. [...] Would it kill people to have to close their tags properly?
XHTML appeals to the intuition that there should be a Strict Right Way To Do Things ... but you can't use that unforgiving framework for web documents that are widely shared.
The "real world" has 2 types of file formats:
(1) file types where consumers cannot contact/control/punish the authors (open-loop): HTML, pdf, zip, csv, etc. The common theme is that the data itself is more important than the file format. That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries, and why both 7-Zip and WinRAR can read malformed zip files with broken headers (because some old buggy Java libraries wrote bad zip files). MS Excel can import malformed csv files. E.g. Citi bank's export to csv wrote a malformed file, and it was desirable that MS Excel imported it anyway, because the raw data of dollar amounts was more important than the incorrect commas in the csv file -- and -- I have no way of contacting the programmer at Citi to tell them to fix the buggy code that created the bad csv file. (A short sketch after this comment makes the two stances concrete.)
(2) file types where the consumer can control the author (closed-loop): programming language source code like .c, .java, etc or business interchange documents like EDI. There's no need to have a "lenient forgiving" gcc/clang compiler to parse ".c" source code because the "consumer-and-author" will be the same person. I.e. the developer sees the compiler stop at a syntax error so they edit and fix it and try to re-compile. For business interchange formats like EDI, a company like Walmart can tell the vendor to fix their broken EDI files.
XHTML wants to be in group (2) but web surfers can't control all the authors of .html so that's why lenient parsing of HTML "wins". XHTML would work better in a "closed-loop" environment such as a company writing internal documentation for its employees. E.g. an employee handbook can be written in strict XHTML because both the consumers and authors work at the same company. E.g. can't see the vacation policy because the XHTML syntax is wrong?!? Get on the Slack channel and tell the programmer or content author to fix it.
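To make the open-loop stance concrete, here's a minimal Python sketch of the two attitudes toward a malformed csv. The sample row with a stray comma in the dollar amount is invented for illustration, not Citi's actual output:

    import csv, io

    # Header promises 2 columns; the unquoted "$1,234.56" yields 3 fields.
    malformed = "date,amount\n2024-01-02,$1,234.56\n"

    def read_strict(text):
        rows = list(csv.reader(io.StringIO(text)))
        width = len(rows[0])
        for i, row in enumerate(rows):
            if len(row) != width:
                raise ValueError(f"row {i}: {len(row)} fields, expected {width}")
        return rows

    def read_lenient(text):
        rows = list(csv.reader(io.StringIO(text)))
        width = len(rows[0])
        # Re-join any extra fields so the dollar amount survives intact.
        return [row[:width - 1] + [",".join(row[width - 1:])] for row in rows]

    # read_strict(malformed) raises on row 1; read_lenient(malformed)
    # recovers [['date', 'amount'], ['2024-01-02', '$1,234.56']].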
The problem is that group (1) results in a nightmarish race-to-the-bottom. File creators have zero incentive to create spec-compliant files, because there's no penalty for creating corrupted files. In practice this means a large proportion of documents are going to end up corrupt. Does it open in Chrome? Great, ship it! The file format is no longer the specification, but it has now become a wild guess at whatever weird garbage the incumbent is still willing to accept. This makes it virtually impossible to write a new parser, because the file format suddenly has no specification.
On the other hand, imagine a world where Chrome would slowly start to phase out its quirks modes. Something like a yellow address bar and a "Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message. Turn it into a red bar and a "click to continue" after 10 years, remove it altogether after 20 years. Suddenly it's no longer that one weird customer who is complaining, but everyone - including your manager. Your mistakes are painfully obvious during development, so you have a pretty good incentive to properly follow the spec. You make a mistake on a prominent page and the CTO sees it? Well, guess you'll be adding an XHTML validator to your CI pipeline next week!
It is very tempting to write a lenient parser when you are just one small fish in a big ecosystem, but over time it will inevitably lead to the degradation of that very ecosystem. You need some kind of standards body to publish a validating reference parser. And like it or not, Chrome is big enough that it can act as one for HTML.
I’d argue a good comparison here is HTTPS. Everyone decided it would be good for sites to move over to serving via HTTPS so browsers incentivised people to move by gating newer features to HTTPS only. They could have easily done the same with XHTML had they wanted.
The opportunities to fix this were pretty abundant. For instance, it would take exactly five words from Google to magically make a vast proportion of web pages valid XHTML:
> - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?
We stop at the first sign of trouble for almost every other format; we do not need lax parsing for HTML. Lax parsing has caused a multitude of security vulnerabilities and only makes things more difficult for pretty much everybody.
The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds and that the role of a standard was just to write down all the bugs.
Just to remind you: <bold> <italic> text </bold> </italic>, which has worked for ages in every browser ever, is NOT valid XHTML, and would be rejected under GP's proposal.
XHTML allows you to use XML, and <bold> and <italic> are just XML nodes with no schema. The correct forms have been and will always be <b> and <i>. Since the beginning.
That caused plenty of incompatibilities in the past. At one point, Internet Explorer would parse that and end up with something that wasn’t even a tree.
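Both behaviors are easy to watch from Python's standard library: a strict XML parser refuses the mis-nested snippet from upthread outright, while the lenient HTML tokenizer just reports tags in the order they appear, with no tree in sight:

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    snippet = "<p><b><i>text</b></i></p>"

    try:
        ET.fromstring(snippet)              # strict, XHTML-style parsing
    except ET.ParseError as err:
        print("rejected:", err)             # "mismatched tag" error

    class Dump(HTMLParser):                 # lenient, tag-soup tokenizing
        def handle_starttag(self, tag, attrs): print("start", tag)
        def handle_endtag(self, tag): print("end", tag)

    Dump().feed(snippet)                    # cheerfully emits the mis-nesting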
HTML is not a set of instructions that you follow. It’s a terrible format if you treat it that way.
> MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. It probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be buried to justify the bailout of NeXT by Apple to get Steve Jobs.
You have things backwards. The Copland project was horribly mismanaged. Anybody at Apple who came up with a new technology got it included in Copland, with no regard to feature creep or stability. There's a leaked build floating around from shortly before the project was cancelled. It's extremely unstable and even using basic desktop functionality causes hangs and crashes. In mid-late 1996, it became clear that Copland would never ship, and Apple decided the best course of action was to license an outside OS. They considered options such as Solaris, Windows NT, and BeOS, but of course ended up buying NeXT. Copland wasn't killed to justify buying NeXT, Apple bought NeXT because Copland was unshippable.
The Word Lens team was bought by Google; it's far better in Google Translate than the local app ever was. You could recreate the old app with a local LLM now pretty easily, but it still wouldn't come close in quality to using Google Translate.
> Would it kill people to have to close their tags properly
It would kill the approachability of the language.
One of the joys of learning HTML when it tended to be hand-written was that if you made a mistake, you'd still see something just with distorted output.
That was a lot more approachable for a lot of people who were put off "real" programming languages because they were overwhelmed by terrible error messages any time they missed a bracket or misspelled something.
If you've learned to program in the last decade or two, you might not even realise just how bad compiler errors tended to be in most languages.
The kind of thing where you could miss a bracket on line 47 but end up with a compiler error complaining about something 20 lines away.
Rust (in particular) got everyone to up their game with respect to meaningful compiler errors.
But in the days of XHTML? Error messages were arcane, you had to dive in to see what the problem actually was.
If you forget a closing quote on an attribute in HTML, all content until the next quote is ignored and not rendered - even if that's the rest of the page. I don't think this is more helpful than an error message. It was just simpler to implement.
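You can reproduce roughly that behavior with Python's lenient HTML tokenizer, which follows the same quote-scanning rule; the markup here is invented for illustration:

    from html.parser import HTMLParser

    class Dump(HTMLParser):
        def handle_starttag(self, tag, attrs): print("tag :", tag, attrs)
        def handle_data(self, data): print("data:", repr(data))

    # The missing quote after "intro means everything up to the next quote
    # is swallowed into the attribute value instead of rendered as text.
    Dump().feed('<p class="intro>this text is eaten</p><p id="x">visible</p>')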
For reference, observe what happens if you try opening this malformed document in a browser: save it with a .xhtml extension, or serve it with MIME type application/xhtml+xml.
Firefox displays naught but the error:

    XML Parsing Error: mismatched tag. Expected: </b>.
    Location: file:///tmp/x.xhtml
    Line Number 22, Column 3:
    </p>
    --^
Chromium displays this banner on top of the document up to the error:

    This page contains the following errors:
    error on line 22 at column 5: Opening and ending tag mismatch: b line 19 and p
    Below is a rendering of the page up to the first error.
Thanks for showing these. We can see Firefox matches the same style of accurate but unhelpful error message.
Chromium is much more helpful in the error message, directing the user to both line 19 and 22. It also made the user-friendly choice to render up to the error.
In the context of XHTML, we should also keep in mind that Chrome post-dates XHTML by almost a decade.
If, on the other hand, you have some sorts of XSLT errors, Firefox gives you a reasonably helpful error message in the dev tools, whereas Chromium gives you a blank document and nothing else… unless you ran it in a terminal. I’m still a little surprised that I managed to discover that it was emitting XSLT errors to stdout or stderr (don’t remember which).
Really, neither has particularly great handling of errors in anything XML. None of it is better than minimally maintained, a lot of it has simply been unmaintained for a decade or more.
- I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so, Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly. One thing I think you might be hinting at though that is a missed trick: macOS today could learn a little from the way iOS and iPadOS is forced to do things and centralise configuration in a single place.
- I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambdas, GCP Cloud Run Functions or Azure Functions (see the sketch after this list).
- Most of your other ideas (with an exception, see below), died either because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet - I do wonder when we might next get a major change in hardware architectures again though, it does feel like we've now got "x86" and "ARM" and that's that for the next generation.
- XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, and messy and evolutionary, because it is a technology that serves people - we don't want people to serve the technology.
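On the serverless point above, a minimal sketch of the load/execute/quit shape using AWS Lambda's Python handler convention. Only the handler(event, context) signature is the platform's; the event shape and greeting logic are invented:

    import json

    def handler(event, context):
        # The runtime may start this process just for one request and
        # freeze or discard it afterwards - the transaction-processor
        # cycle: load program, do something, exit program.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"hello, {name}"}),
        }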
> XHTML died because it was too hard for people to get stuff done.
This is not true. The reason it died was because Internet Explorer 6 didn’t support it, and that hung around for about a decade and a half. There was no way for XHTML to succeed given that situation.
The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React.
People can deal with strict syntax. They can manage it with JSX, they can manage it with JSON, they can manage it with JavaScript, they can manage it with every back-end language like Python, PHP, Ruby, etc. The idea that people see XHTML being parsed strictly and give up has never had any truth to it.
On XHTML, I think there was room for both HTML and a proper XHTML that barks on errors. If you're a human typing HTML or using a language where you build your HTML by concatenation like early PHP, sure it makes sense to allow loosey goosey HTML but if you're using any sort of simple DOM builder which should preclude you from the possibility of outputting invalid HTML, strict XHTML makes a lot more sense.
Honestly I'm disappointed the promised XHTML5 never materialized along side HTML5. I guess it just lost steam.
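A sketch of the DOM-builder point above: when markup is built as a tree and then serialized, mis-nesting is simply unrepresentable, so strict parsing on the receiving end costs the author nothing. Python's stdlib stands in here for whatever builder a framework provides:

    import xml.etree.ElementTree as ET

    html = ET.Element("html", xmlns="http://www.w3.org/1999/xhtml")
    p = ET.SubElement(ET.SubElement(html, "body"), "p")
    b = ET.SubElement(p, "b")
    b.text = "bold "
    ET.SubElement(b, "i").text = "and italic"

    # The serializer closes every element it opened, in order, so the
    # output is always well-formed XHTML.
    print(ET.tostring(html, encoding="unicode"))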
But a HTML5 parser will obviously parse "strict" HTML5 just fine too, what value is there to special-case the "this was generated by a DOM builder" path client-side?
> Honestly I'm disappointed the promised XHTML5 never materialized along side HTML5. I guess it just lost steam.
The HTML Standard supports two syntaxes, HTML and XML. All browsers support XML syntax just fine—always have, and probably always will. Serve your file as application/xhtml+xml, and go ham.
> Would it kill people to have to close their tags properly?
Probably not, but what would be the benefit of having more pages fail to render? If xhtml had been coupled with some cool features which only worked in xhtml mode, it might have become successful, but on its own it does not provide much value.
> but what would be the benefit of having more pages fail to render?
I think those benefits are quite similar to having more programs failing to run (due to static and strong typing, other static analysis, and/or elimination of undefined behavior, for instance), or more data failing to be read (due to integrity checks and simply strict parsing): as a user, you get documents closer to valid ones (at least in the rough format), if anything at all, and additionally that discourages developers from shipping a mess. Then parsers (not just those in viewers, but anything that does processing) have a better chance to read and interpret those documents consistently, so even more things work predictably.
Sure, authoring tools should help authors avoid mistakes and produce valid content. But the browser is a tool for the consumer of content, and there is no benefit for the user if it fails to render some existing pages.
It is like Windows jumping through hoops to support backwards compatibility even with buggy software. The interest of the customer is that the software runs.
Say a developer accidentally left an opening comment at the start of the HTML.
Rhetorical question: should the browser display the page even if it is commented out?
There is some bar for what is expected to work.
If all browsers consistently errored out on unclosed tags, it would definitely force developers to close tags; it would become common knowledge, second nature.
> there is no benefit for the user if it fails to render some existing pages
What if the browser renders it incorrectly? If a corrupt tag combination leads to browser X parsing "<script>" as inline text but browser Y parsing it as a script tag, that could lead to serious security issues!
Blindly guessing at the original author's intent whenever you encounter buggy content is a recipe for disaster. Sometimes it is to the user's benefit to just refuse to render it.
HTML5 was the answer for the consistency part: where before browsers did different things to recover from "invalid" HTML, HTML5 standardizes it because it doesn't care about valid/invalid as much, it just describes behavior anyways.
XHTML is XML. XML-based markup for content can be typeset into PDF, suitable for print media. I invite you to check out the PDFs listed in the intro to my feature matrix comparison page, all being sourced from XHTML:
The reason XHTML failed is because the spec required it to be sent with a new MIME type (application/xhtml+xml, I believe) which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient).
Meanwhile, local files with the doctype would be treated as XHTML, so people assumed the doctype was all you needed. So everyone who tried to use XHTML didn't realize that it would go back to being read as HTML when they upload it to their webserver/return it from PHP/etc. Then, when something went wrong/worked differently than expected, the author would blame XHTML.
Edit: I see that I'm getting downvoted here; if any of this is factually incorrect I would like to be educated please.
> The reason XHTML failed is because the spec required it to be sent with a new MIME type (application/xhtml+xml, I believe) which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient).
None of that is correct.
It was perfectly spec. compliant to label XHTML as text/html. The spec. that covers this is RFC 2854 and it states:
> The text/html media type is now defined by W3C Recommendations; the latest published version is [HTML401]. In addition, [XHTML1] defines a profile of use of XHTML which is compatible with HTML 4.01 and which may also be labeled as text/html.
There’s no spec. that says you need to parse XHTML served as text/html as HTML not XHTML. As the spec. says, text/html covers both HTML and XHTML. That’s something that browsers did but had no obligation to.
The mismatched doctype didn’t trigger quirks mode. Browsers don’t care about that. The prologue could, but XHTML 1.0 Appendix C told you not to use that anyway.
Even if it did trigger quirks mode, that makes no difference in terms of tag soup. Tag soup is when you mis-nest tags, for instance <strong><em></strong></em>. Quirks mode was predominantly about how it applied CSS layout. There are three different concepts being mixed up here: being parsed as HTML, parsing tag soup, and doctype switching.
The problem with serving application/xhtml+xml wasn’t anything to do with web servers. The problem was that Internet Explorer 6 didn’t support it. After Microsoft won the browser wars, they discontinued development and there was a five year gap between Internet Explorer 6 and 7. Combined with long upgrade cycles and operating system requirements, this meant that Internet Explorer 6 had to be supported for almost 15 years globally.
Obviously, if you can’t serve XHTML in a way browsers will parse as XML for a decade and a half, this inevitably kills XHTML.
Yes, I covered that; everyone assumed that you only needed to specify the doctype, but in practice browsers only accepted it for local files or HTTP responses with Content-Type: application/xhtml+xml. I've edited the comment to make that more explicit.
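For anyone who wants to try strict parsing today, the server-side fix is a one-line MIME mapping. A minimal sketch with Python's stdlib server (the port and file layout are arbitrary; recent Python versions may already map .xhtml correctly, the override just makes it explicit):

    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    class XhtmlHandler(SimpleHTTPRequestHandler):
        # guess_type() consults this table first, so .xhtml files get the
        # XML MIME type and browsers use the strict XHTML parser.
        extensions_map = {
            **SimpleHTTPRequestHandler.extensions_map,
            ".xhtml": "application/xhtml+xml",
        }

    if __name__ == "__main__":
        ThreadingHTTPServer(("localhost", 8000), XhtmlHandler).serve_forever()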
I love this mismatched list of grievances and I find myself agreeing with most of them. XHTML and proper CPU hypervisors in particular.
People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem. Modern HTML is a cesspool. I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it. Is that part of the reason why we have so few browsers?
> People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem.
Your chronology is waaaaaaaaaaaay off.
<BR> came years before XML was invented. It was a tag that didn’t permit children, so writing it <BR></BR> would have been crazy, and inventing a new syntax like <BR// or <BR/> would have been crazy too. Spelling it <BR> was the obvious and reasonable choice.
The <br /> or <br/> spelling was added to HTML after XHTML had already basically lost, as a compatibility measure for porting back to HTML, since those enthusiastic about XHTML had taken to writing it and it was nice having a compatible spelling that did the same in both. (In XHTML you could also write <br></br>, but that was incorrect in HTML; and if you wrote <br /> in HTML it was equivalent to <br /="">, giving you one attribute with name "/" and value "". There were a few growing pains there, such as how <input checked> used to mean <input checked="checked">—it was actually the attribute name that was being omitted, not the value!—except… oh why am I even writing this, messy messy history stuff, engines doing their own thing blah blah blah; these days it's <input checked="">.)
Really, the whole <… /> thing is more an artefact of an arguably-misguided idea after a failed reform. The absolute mayhem came first, not last.
> I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it.
The HTML parser is magnificent, by far the best spec for something reasonably-sized that I know of. It’s exhaustively defined in terms of state machines. It’s huge, far larger than one would like it to be because of all this compatibility stuff, but genuinely easy to implement if you have the patience. Seriously, go read it some time, it’s really quite approachable.
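To give a flavor of that style, here is a toy two-state tokenizer in Python - nothing like the real tokenizer's dozens of states and error-recovery rules, just the data/tag-open skeleton the spec builds on:

    def tokenize(html):
        state, buf, tokens = "data", "", []
        for ch in html:
            if state == "data":
                if ch == "<":
                    if buf:
                        tokens.append(("text", buf))
                    buf, state = "", "tag-open"
                else:
                    buf += ch
            elif state == "tag-open":
                if ch == ">":
                    tokens.append(("tag", buf))
                    buf, state = "", "data"
                else:
                    buf += ch
        if buf:
            tokens.append(("text", buf))
        return tokens

    print(tokenize("hello <b>world</b>"))
    # [('text', 'hello '), ('tag', 'b'), ('text', 'world'), ('tag', '/b')]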
Adobe Flash / Shockwave. After all these decades, I've yet to see a tool that makes it as easy to make games or multimedia as Flash did. One of many reminders recently (many others in politics) that humanity doesn't just inevitably or linearly move forward in any domain, or even 2 steps forward 1 step back. Some things are just lost to time - maybe rediscovered in a century, maybe never.
Godot is pretty awesome. Easy to learn, can do 2D or 3D, and can export to HTML5/WebAssembly that works across all major OSes and browsers, including mobile.
It’s far from perfect but I’ve been enjoying playing with it even for things that aren’t games and it has come a long way just in the last year or two. I feel like it’s close to (or is currently) having its Blender moment.
Even if Adobe had gotten their act together and fixed all security holes, Apple would have still killed it. It was always a threat as a popular design tool. And decades later, with the HTML canvas hype faded, there's still no replacement to what Adobe Flash could do - any designer could create stellar, interactive design that can be embedded into any website...without a monthly subscription.
I don't think that's the case. For the longest while Flash was faster than JS at doing anything vaguely graphics-based. The issue for Apple was that the CPU in the iPhone wasn't fast enough to do Flash and anything else. Moreover, Adobe didn't get on with Jobs when they were talking about custom versions.
You have to remember that "apps" were never meant to be a thing on the iphone, it was all about "desktop" like web performance.
Flash performance is still better than the current web stack's. It probably always will be - you could write non-trivial games that would work on a machine with 128MB of memory. Currently a single browser tab with a simple page can take more than that.
Performance was way better than what we have now with modern web stacks, we just have more powerful computers.
I agree on security and bugs, but bugs can be fixed. It just shows neglect by Adobe, which was, I think, the real problem. I think that if Adobe seriously wanted to, it could have been a web standard.
Macromedia Fireworks was an outstanding piece of software.
The 20 most common things you’d do with the tool were there for you in obvious toolbars. It had a lot of advanced features for image editing. It had a scripting language, so you could do bulk editing operations. It supported just about every file extension you could think of.
Most useful feature of all was that it’d load instantly. You’d click the icon on the desktop, and there’d be the Fireworks UI before you could finish blinking. Compared to 2025 Adobe apps, where you click the desktop icon and make a coffee while it starts, it’s phenomenal performance.
Enabling novice normies to make games was excellent, and I believe the whole game industry benefited from the resulting injection of fresh ideas. A lot of indie developers with fresh takes on what games could be got started this way. Zachtronics is one example of many that comes to mind right now.
On the other hand, for every Flash game made there were about ten thousand Flash-based ads, and nearly as many websites that used Flash poorly for things like basic navigation (remember Flash-based website dropdown menus?). And for a few years it seemed like every single restaurant with a website was using Flash for the entire thing; the results were borderline unusable in the best cases. And let's not forget that as long as Flash was dominant, it was choking out the demand to get proper video support into browsers. Flash-based video players performed like dog shit and made life on Linux a real chore.
Personal pet peeve, but as someone who still makes gifs: ImageReady. Adobe kind of absorbed ImageReady into Photoshop, and Photoshop has just never lived up to how easy it was to make simple gifs in ImageReady.
Yes. I never used Flash personally, but I loved those little games people created with it. There was a whole scene of non-developers creating little games of all kinds, and it just ceased to exist.
Ruffle is amazing. I launched a 20+ year old game yesterday with zero compatibility issues. Even better than the original Flash because of superior security isolation mechanisms.
Kids now create games in Roblox.
More constrained, more commercial, more exploitative - but there is still a huge scene of non-developers creating games if you care to look.
The Ricochet network. A packet mesh network providing ISDN speeds in the dialup era, wirelessly.
They burned through $5B of 1999 dollars, building out a network in 23 cities, and had effectively zero customers. Finally shut down in 2001.
All their marketing was focused on "mobile professionals", whoever those were, while ignoring home users who were clamoring for faster internet where other ISPs dragged their feet.
Today, 5G femtocells have replicated some of the concept (radically small cell radius to increase geographic frequency reuse), but without the redundancy -- a femtocell that loses its uplink is dead in the water, not serving as a relay node. A Ricochet E-radio that lost its uplink (but still had power) would simply adjust its routing table and continue operating.
Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.
I was blown away by the demo, but then after I thought about it, it seemed like a nightmare to me: all the problems of Slack, of having to manually check channels for updates, except times 100. (Yeah, I get that Slack wasn't available then.) My point is that it seemed impossible to keep up with nested, constantly updated hierarchical threads. Keeping up with channels on Slack is bad enough, so imagine if Wave had succeeded. It'd be even worse.
Wave was great for conversation with one or two other people on a specific project, which I'm sure most people here used it for. I can't imagine it scaling well beyond that.
I managed trips with friends and it was a great form factor for ad-hoc discussions with docs and links included. I thought it was the future and in my very early programming days wrote probably the most insecure plugin ever to manage your servers.
Google Wave was built on an awesome technology layer, and then they totally blew it on the user interface: deciding to treat it as a set of separate items, instead of a single document everyone everywhere all at once could edit, killed it.
It made it seem needlessly complicated, and effectively erased all the positives.
Google Wave had awesome tech but if you look at the demo in hindsight you can tell it’s just not a very good product. They tried making an all-in-one kind of product which just doesn’t work.
In a sense Wave still exists but was split into multiple products, so I wouldn’t say it’s “dead”. The tech that powered it is still used today in many of Google’s popular products. It turns out that having separate interfaces for separate purposes is just more user friendly than an all-in-one.
Even the watered-down version of Wave was something I used at my startup; it was effectively our project management tool. And it was amazing at that.
I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss.
It's indeed not a good one. Discord refined instant messaging and bolts other things on top, like forums, but isn't fundamentally different. Google Wave was (and still is) a completely different paradigm. Everything was natively collaborative: it mixed instant messaging with document editing (like Google Docs or pads), and any widget you could think of (polls, calendars, playing music, drawing, ...) could be added by users through sandboxed JavaScript. The closest current thing I can think of is DeltaChat's webxdc.
Google sucked/sucks at executive function because they completely lack appreciation for proper R&D and long-term investment, and also kill things people use and love.
Yep. And rather than ask people, focus group, or look at the evidence, they just guess or do whatever they want. Not much leadership or community engagement appears to be involved.
Optane persistent memory had a fascinating value proposition: stop converting data structures for database storage and just persist the data directly. No more booting or application launch or data load: just pick up where you left off. Died because it was too expensive, but probably long after it should have.
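The programming model is easy to approximate with an ordinary file-backed mmap. A rough sketch - real persistent memory used DAX mappings and cache-line flushes rather than page-cache writeback, and the file name and counter layout here are invented:

    import mmap, os, struct

    PATH = "counter.bin"
    if not os.path.exists(PATH):
        with open(PATH, "wb") as f:
            f.write(struct.pack("Q", 0))        # one 8-byte counter

    with open(PATH, "r+b") as f:
        mem = mmap.mmap(f.fileno(), 8)
        runs = struct.unpack_from("Q", mem)[0]  # pick up where we left off
        struct.pack_into("Q", mem, 0, runs + 1) # mutate in place: no load,
        mem.flush()                             # no serialization step
        print("run number", runs + 1)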
VM's persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
The world had already caught up. By the time it was released, flash memory was already nearing its speed and latency, to the point that the difference wasn't worth the cost.
>flash memory was already nearing its speed and latency
Kinda, but for small writes it's still nowhere near.
Samsung 990 Pro - 4K random, QD1: 113 MBytes/sec
P4800X Optane - 4K random, QD1: 206 MBytes/sec
And that's against a device 5 years newer and on a faster PCIe generation.
It disappeared because the market that values the above attribute is too small, and it's hard to market because, at first glance, the two look about the same on a lot of metrics, as you say.
Systems are stuck in old ways in how they model storage, so they weren't ready for something that is neither really RAM nor disk.
Optane did inspire quite a few research projects for a while though. A few applications emerged in the server space, in particular.
How does that work? It loads the kernel from the drive to RAM?
Isn't Windows fast boot something like that (only slower, depending on the SSD)? It semi-hibernates, storing the kernel part of memory on disk for faster startup.
This one would have behaved more like suspend to RAM. In suspend to RAM, the RAM is kept powered, while everything else is shut down. The recovery would be near instant, since all the execution contexts are preserved on the RAM.
Optane was nearly as fast as RAM, but also persistent like a storage device. So you could do a suspend to RAM without the requirement to keep it powered like RAM.
Not only because of price. The 'ecosystem' infrastructure wasn't there, or at least not spread wide enough. The 'mindshare'/thinking of ways how to do, neither. This is more aligned with (live) 'image-based' working environments like early Lisp and Smalltalk systems. Look at where they are now...
A few more thoughts about that, since I happen to have some of the last systems that actually had systems-level support for this in their firmware, and early low-capacity Optanes designed for that sort of use. It's fascinating to play with these, but they are low capacity and bound to obsolete operating systems.
Given enough RAM, you can emulate that with working suspend and resume to and from RAM.
Another avenue is the ever faster and larger SSDs; in practice, with some models it makes almost no difference anymore, since random access times are so fast and transfer speeds insane. Maybe total and/or daily TBW remains a concern.
Google Reader: I will forever be salty about how Google killed something that likely required very little maintenance in the long run. It could have stayed exactly the same for a decade and I wouldn't have cared, because I use an RSS reader exactly the same way now that I did back in 2015.
Yes. That was the single worst business decision in Google history, as somebody correctly noted. It burned an enormous amount of goodwill for no gain whatsoever.
Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc. They had been painfully taught not to trust Google, and, since that time, they didn't. And still don't.
Just think of the data mining they could have had there.
They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared…
They could have just kept the entire service running with, what, 2 software engineers? Such a waste.
This would require the decision-maker to think and act at the scale and in interests of the entire company. Not at the scale of a promo packet for next perf: "saved several millions in operation costs by shutting down a low-impact, unprofitable service."
There is some truth in this. I fit into a few of these buckets and I don’t think I could ever recommend their enterprise stuff after having my favourite consumer products pulled.
> Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.
I still use Picasa; it works fine. However, when Google severed the GDrive-Photos linking, my photos no longer automatically downloaded from Google to my PC. This is what killed Google for me.
Hmm, good to know. But given Google's history, I assumed that it would stop working.
I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.
In absolutely shocking news, it did stop working and then Google went out of their way to fix it.
I genuinely thought all the chromecast audios I owned were useless bricks and was looking around for replacements and then they just started working again from an OTA update. Astounding. I assume someone got fired for taking time away from making search worse to do this.
I'm still amused that they killed Google Notebook and then a few years later created Google Keep, an application with basically the same exact feature set.
You can say that for a fair few of the services mentioned by GP.
Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium.
No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works?
The only one I'm not sure about is Chromecast - a while back my ones had an "update" to start using their newer AI Assistant system for managing it. Still works.
That was probably me, when I stopped using Google Search some years ago. :-) Got tired of the ads, the blog spam, and AI-generated content crap floating to the top of their results page.
The https://udm14.com/ flavor of Google is quite usable, though, esp with notable operators like inurl:this-or-that. But, all in all, yeah, gimme back vanilla Google search from 2008-2010 or so. Back then it was definitely a tool (I worked in investigative journalism at the time), whereas currently "searching" stands for sitting fingers crossed and hoping for the better. But, oh well. </rant>
That's more what I meant. Sure, lots of people still type stuff into the URL bar that takes them to www.google.com/search. But whatever you want to call that results page now, it's no longer Google Search in anything but name.
The same can be said if you compare www.google.com search from 2012 and 2022; times are changing… I am not defending Google search here - I haven't used it except by accident in a long time now - but to say Google search is "dying", like you often hear (especially here on HN), is a serious detachment from reality.
Picasa was awesome, they had face recognition years before almost everything else, in a nice offline package.
Unfortunately the last public version has a bug that randomly swaps face tags, so you end up training it on the wrong person's face just enough to throw it all off, and the recognition becomes effectively worthless on thousands of family photos. 8(
Digikam is a weak sauce replacement that barely gets the job done.
I use this free and extremely bare bones app made by a friend: https://apps.apple.com/us/app/max-where/id1579123291. It tracks your location constantly, has a basic viewer, and lets you export to CSV. That’s about it but it’s all I need.
Check out Dawarich. It has an official iOS app, and you can use a number of 3rd-party mobile apps to track your data and then upload it to a server: either one run on your own hardware (FOSS, self-hosted) or the Dawarich Cloud one: https://dawarich.app
I’m still using
- free g suite
- play music
- finance
- nfc wallet is just google wallet isn’t it?
- chromecast, video and audio-only
I guess play music is now YouTube music, and doesn't have uploads, so that can be considered dead, but the others seem alive to me.
Which particular thing called Hangouts? There were at least two, frankly I’d say more like four.
Google and Microsoft are both terrible about reusing names for different things in confusing ways.
> Can't keep track of all the Google chat apps.
And Hangouts was part of that problem. Remember Google Talk/Chat? That was where things began, and in my family we never wanted Hangouts, Talk/Chat was better.
Allo, Chat, Duo, Hangouts, Meet, Messenger, Talk, Voice… I’ve probably forgotten at least two more names, knowing Google. Most of these products have substantial overlap with most of the rest.
I used Picasa and loved it, until I realized I want all my photos available from all my devices at all times and so gave in to Google Photos (for access, not backup)
I use SyncThing for that purpose. It syncs across my phone, my laptops, and my Synologies. But I don't sync all my photos.
I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals.
I used Google Talk, then Hangouts, but once they switched to Meet, I gave up on them. By then my family was all using Hangouts, and we never settled on a new service, because one of my siblings didn't want to support any chat services that don't freely give user information to the government, and the rest of us didn't want to use a chat platform that does freely give user information to the government.
Am I the only one salty about Google Podcasts? For me that was the straw that broke the camel's back… I dropped Android, switched to iOS, and am slowly phasing out the Google products in my life.
From what I can tell (since I am just finding out about this today), they stopped manufacturing the old Chromecast hardware, and at some point, will stop supporting the old devices. The old devices may stop working in the future, for example, because they sunset the servers. Like their thermostats. Who knows?
Yahoo Pipes. It was so great at creating RSS feeds and custom workflows. There are replacements now, like Zapier and n8n, but I loved it. Also Google Reader, which has been mentioned multiple times already.
Yahoo Pipes was what the internet should have been. We're so many decades into computing, and that kind of inter-tool linking has only barely been matched by Unix pipes.
Many companies are working very hard to make that impossible unfortunately. For example you can't get posts from public Facebook groups automatically, although that would be a really good source candidate. They used to allow it, but... not anymore.
I loved pipes. I had rss feeds from all the sites where I was sharing content collected up and formatted via pipes into a single rss feed that was pulled into a php blog.
Then all those sites I used to post on stopped supporting rss one by one and finally pipes was killed off.
For a while I used a python library called riko that did the same thing as pipes without the visual editor. I have to thank it for getting me off php and into python.
If anyone with time, money and resources wants to revive the ideas of Yahoo! Pipes then I would suggest using Node-RED[^1] as a good starting point.
It has the advantage of being open source, has well defined and stable APIs and a solid backend. Plus 10+ years of constant development with many learnings around how to implement flow based programming visually.
I used the Node-RED frontend to create Browser-Red[^2] which is a Node-RED that solely executes in the browser, no server required. It does not support all Node-RED functionality but gives a good feel for using Node-RED and flow based programming.
The second project with which I am using Node-RED frontend is Erlang-Red[^3] which is Node-RED with an Erlang backend. Erlang is better suited to flow based programming than NodeJS, hence this attempt to demonstrate that!
Node-RED makes slightly different assumptions than Yahoo! Pipes - input ports being the biggest: all nodes in Node-RED have either zero or one input wires, nodes in Yahoo! Pipes had multiple input wires.
A good knowledge of jQuery is required but that makes it simpler to get into the frontend code - would be my argument ;) I am happy to answer questions related to Node-RED, email in bio.
I can recommend Apache Camel (https://camel.apache.org) for similar data integration pipelines and even agentic workflows. There are even visual editors for Camel today, which IMHO make it extremely user friendly to build any kind of pipeline quickly.
I missed Yahoo Pipes a lot so I built something similar recently for myself :) I know there are a few alternatives out there, but had to scratch my own itch.
Pascal/Delphi - especially in the educational context.
A crazy fast compiler, so it doesn't frustrate trial-and-erroring students; a decent type system without the wildness of, say, Rust; and all the basic programming building blocks you want students to grasp, present without language-specific funkiness.
Delphi isn't dead - ver 13 was recently released - https://www.embarcadero.com/products/delphi. It's even cross platform, uses Skia as its graphics engine, its all very nice.
Iirc Delphi didn’t have threads, sockets, or OS integration (signals, file watching …). So it wasn’t suited to systems programming ie servers and services. It nailed gui applications, and that was a lot. Maybe freepascal has threads and sockets but imo it was too late.
Midori, Microsoft's capability-based security OS[1]. Rumor has it that it was getting to the point where it was able to run Windows code, so it was killed through internal politics, but who knows! It was the Fuchsia of its time...
I've heard someone at Microsoft describe it as a moonshot but also a retention project; IIRC it had a hundred plus engineers on it at one time, including a lot of very senior people.
Apparently a bunch of research from Midori made it into .NET so it wasn't all lost, but still...
The technical foundation seems interesting, but knowing Microsoft, this would have just become yet another bloated mess with its own new set of problems. And by now it would equally have become filled with spyware and AI "features" users don't want.
Netflix Falcor. The GraphQL hype killed a much better alternative for many use cases. There were only a few missing pieces and improvements needed, such as a proxy-based adapter layer for popular frontend frameworks. I'm now the lonely last user, hoping to find a way to reboot development.
Sandstorm: it seemed quite nice with a lot of possibilities when it launched in 2014, but it didn’t really take off and then it moved to sandstorm.org.
The actual problem with Sandstorm wasn't the era in which it was released. It would probably have the same problems even if released today. The problem was its application isolation mechanism - especially the data isolation (I think the units were called grains). The mechanism is technically brilliant, but it's a big departure from how apps are developed today. It means that you have to make non-trivial modifications to web applications before they can run on the platform. The platform works better for applications designed for it from the start. It should have been marketed as a platform for building web applications, rather than as one for just deploying them.
Sandstorm was a great idea, but in my opinion it was targeted wrong. It should have been a platform and marketplace for B2B SaaS, not B2C SaaS. Specifically, all the third-party services which typical web apps use could have been Sandstorm apps, like analytics, logging, email, customer service etc.
Vine. It was already pretty big back in 2013, but Twitter had no idea what to do with it. TikTok actually launched just a few months before Vine was shut down and erased from the internet.
Whoever made the decision to kill Vine was an absolute moron, even without hindsight. It was square videos; how hard could it have been to shove an ad banner above them and call it a day? Incredible.
Heroku? I know it's still around, though IDK who uses it, but I miss those days when it was thriving. One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.
I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things? Would we have ever bothered to come up with all the different tech, if AI was just chugging through features efficiently, with consistent training data etc.?
I talked to some Heroku reps at a local tech conference a year or so ago. It was clear that they were instructed not to have any personal opinions on the shredding of the free tier, but they did admit, in a roundabout way, that it lost them a lot of customers - some they were glad to get rid of, as they were gaming the goodwill and costing Heroku lots of money, but they weren't sure whether it was a good long-term idea or not.
As soon as they put a persistent Salesforce brand banner across the top which did nothing but waste space and put that ugly logo in our face every day, my team started our transition off Heroku pretty much right away.
I use the core product for my SaaS apps. Great platform, does what it needs to do. Haven’t felt the need to switch. Sometimes tempted to move to a single VPS with Coolify or Dokku, but not interested in taking on the server admin burden.
My company still uses Heroku in production actually. Every time I see the Salesforce logo show up I wince, but we haven't had any issues at all. It continues to make deployment very easy.
Didn't they offer free compute? IIRC all free compute on the Internet went away with the advent of cryptocurrencies as it became practical to abuse the compute and translate it directly into money.
I think their main failure points were the following:
- Not lowering prices as time went on. They probably kept a super-high profit margin, but they're largely irrelevant today.
- Not building their own datacenters and staying on AWS. That would have allowed them to lower prices and gain even more market share. Everyone who has been at Amazon/AWS has likely seen the internal market rate for EC2 instances and knows there's a HUGE profit margin to be had by building datacenters. Add the recent incredible improvements in compute density (you can easily get 256c/512t and literally terabytes of memory in a 2U box) and you get basically an infinite money glitch.
Quartz Composer - Apple's "patch-based" visual programming environment. Drag out a bunch of nodes, wire them together, build a neat little GUI.
10+ years ago I'd regularly build all sorts of little utilities with it. It was surprisingly easy to use it to tap into things that are otherwise a lot more work. For instance I used it to monitor the data coming from a USB device. Like 3 nodes and 3 patches to make all of that work. Working little GUI app in seconds.
Apple hasn't touched it since 2016. I kind of hope it makes a comeback, given Blender and, more so, Unreal Engine giving people a taste of the node-based visual programming life.
You can still download it from Apple, and it still technically works but a lot of the most powerful nodes are broken in the newer OS's. I'd love to see the whole thing revitalized.
I was a holdout on smartphones for a while, and I used to print out K5 articles to read while AFK… just such an amazing collection of people sharing ideas and communal moderation, editing, and upvoting.
I learned about so many weird and wonderful things from that site.
The internet before advertising, artificial intelligence, social media and bots. When folks created startups in their bedrooms or garages. The days when google slogan was “don’t be evil”.
I really miss the push from like 8 years ago where a lot of major projects were moving to IRC. It's too bad Freenode took the opportunity to jump the shark and killed the momentum.
I mean, they're intentionally buried in the name of capital. If you need more than a Google search to find them, of course no one will go to them.
I don't like comparing the siloing of our information into Discord to the old internet. We had indexable information in forums that is now "lost" - not in the literal sense, but because you wouldn't be able to find it without obsessive digging. Conversations in Discord communities are very surface-level and cyclical, because it's far less straightforward to keep track of and link to answers from last week, let alone two years ago. It is profoundly sad, to be honest.
Tauri apps take advantage of the web view already available on every user’s system. A Tauri app only contains the code and assets specific for that app and doesn’t need to bundle a browser engine with every app.
Rendering will still use Edge/Chromium on a generic Windows machine.
Looking at Firefox memory usage, I'm afraid the issue there is not memory safety but rather the average JavaScript developer being completely and blissfully unaware of, and careless about, the memory usage of the software they write.
I liked del.icio.us, it was online bookmark sharing, but with actual people I knew, and it had genuinely useful category tagging. I guess it was basically replaced with https://old.reddit.com and maybe twitter.
Isn’t Pinboard (Who bought delicious) very similar? I also see bookmarks of my friend there, recently switched to Raindrop though as it’s much more maintained.
ReactOS, the effort to create a free and open source Windows NT reimplementation.
It has been in existence in some form or another for nearly 30 years, but did not gain the traction it needed and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress on it is moving so slow that I doubt we'll ever see it be released in a state that's useful for real users.
It's too bad, because a drop in Windows replacement would be nice for all the people losing Windows 10 support right now.
On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc.
Unix and its source code was pretty well publicly documented and understood for decades before those projects started, nothing like that ever really existed for Windows.
They had no chance. Look how long it took for Wine to get where it is. Their project is Wine + a kernel + device-driver compatibility, and a moving target.
> I think people underestimate the difficulty involved in the project
I don't think people do; it sounds like a nearly impossible struggle, and at the end you get a Windows clone. I can't imagine hating yourself enough to work on it for an extended period of time for no money, putting yourself and your hard work at legal risk. It's a miracle we have Wine, and serious luck that we have Proton.
People losing Windows 10 support need to move on. There's Linux if you want to be free, and Apple if you still prefer to be guided. You might lose some of your video games. You can still move to Windows 11 if you think that people should serve their operating systems rather than vice versa.
> ReactOS, the effort to create a free and open source Windows NT reimplementation.
Some projects creep along slowly until something triggers an interest and suddenly they leap ahead.
MAME's Tandy 2000 implementation was unusable, until someone found a copy of Windows 1.0 for the Tandy 2000, then the emulation caught up until Windows ran.
Maybe ReactOS will get a big influx of activity after Windows 10 support goes offline in a couple days, or even shortly after when you can't turn AI spying off, not even three times a year.
Not so long ago there was a leak of Windows source code, up to XP and Server 2003… the leak was so complete there are videos on YouTube of people building and booting (!!!) Windows from it.
And yet, no big leap in ReactOS (at least for now).
Leaks like this actually slow down ReactOS development.
The project is supposed to be a clean-room reverse engineering effort. If you even see Windows code, you are compromised, and should not work on ReactOS.
They need to train an LLM on the Windows source code and ask it to write a Windows clone.
Apparently copyright law only applies for humans, generative AI gets away with stealing because there is too much monetary interest involved in looking the other way.
I've heard people say this, and believed it myself for a long time, but recently I set up a Windows XP VM and was shocked by how bad the quality of life was.
I think nostalgia is influencing this opinion quite a bit, and we don't realize the mountain of tiny usability improvements that have been made since XP.
Wine, Proton and virtualization all got good enough that there's no need for a half-baked binary-compatible Windows reimplementation, and I think that took a lot of the oxygen out of what could have been energy towards ReactOS. It's a cool concept but not really a thing anybody requires.
Full vector, DPI-aware UI, with grids, complex animation, and all the other stuff that HTML5/CSS didn't have in 2018 but Silverlight had even in 2010 (probably even earlier).
MVVM pattern, two-way bindings. Expression Blend (basically Figma) that let designers create UI that was XAML, had sample data, and could be used by devs as is, with maybe some cleanup.
Excellent tooling, static analysis, debugging, what have you.
Rendered and worked exactly the same in any browser (Safari, IE, Chrome, Opera, Firefox) on Mac and Windows.
If that thing still worked, boy would we be in a better place regarding web apps.
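For readers who never touched Silverlight, here is a minimal sketch of the two-way binding idea behind MVVM, with plain Python standing in for XAML bindings; nothing here is Silverlight-specific:

```python
# Minimal MVVM-flavoured two-way binding: the view model notifies bound
# observers when a property changes, and the "view" writes back through
# the same property. Silverlight/XAML expressed this declaratively.
class Observable:
    def __init__(self, value):
        self._value = value
        self._observers = []

    def bind(self, callback):
        self._observers.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        if new != self._value:
            self._value = new
            for callback in self._observers:
                callback(new)  # push the change out to the bound view

name = Observable("Ada")
name.bind(lambda v: print(f"view re-renders with: {v}"))
name.value = "Grace"   # view model -> view: prints "view re-renders with: Grace"
```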
Unfortunately, iPhone killed adobe flash and Silverlight as an aftermath. Too slow processor, too much energy consumption.
I am happy this one died. It was just another attempt by Microsoft to sidestep open web standards in favor of a proprietary platform. The other notorious example is Flash, and both should be considered malware.
Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
> Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
Why do you think JavaScript is a problem? And a big enough problem to risk destroying open web standards.
TypeScript exists for the same reason things like mypy exists, and no one in their right mind claims that python's openness should be threatened just because static typing is convenient.
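A quick illustration of that point with mypy, since the analogy carries the argument: the annotations change nothing at runtime, and the checker is a separate, optional tool.

```python
# Type hints are inert at runtime; a separate checker (mypy) enforces them.
def total(prices: list[float]) -> float:
    return sum(prices)

print(total([1.0, 2.5]))   # fine at runtime and under mypy
# total("oops")            # still plain Python, but mypy flags this call
#                          # as passing "str" where "list[float]" is expected
```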
Though in principle they serve similar purposes, there are some big differences. Python with types is still just Python. TypeScript is a different language from JS (I guess technically a superset?), and it being controlled by a large company could be considered problematic.
I suppose JS could go in the same direction and adopt the typing syntax from TS as a non-runtime thing. Then the TypeScript compiler would become something like mypy, an entirely optional part of the ecosystem.
> A remote code execution vulnerability exists when Microsoft Silverlight decodes strings using a malicious decoder that can return negative offsets that cause Silverlight to replace unsafe object headers with contents provided by an attacker. In a web-browsing scenario, an attacker who successfully exploited this vulnerability could obtain the same permissions as the currently logged-on user. If a user is logged on with administrative user rights, an attacker could take complete control of the affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights.
Back in the day Microsoft sent someone to our university to demo all of their new and upcoming products. I remember Vista (then named Longhorn) and Silverlight being among them. I also remember people being particularly impressed by the demo they gave of the latter, but everything swiftly falling apart when someone queried whether it worked in other browsers. This was at a time when IE was being increasingly challenged by browsers embracing open standards. So there was an element of quiet amusement/frustration in seeing them continue to not get it.
I loved silverlight. Before I got a “serious” job, I was a summer intern at a small civil engineering consultancy that had gradually moved into developing custom software that it sold mostly to local town/city/county governments in Arizona (mostly custom mapping applications; for example, imagine Google Maps but you can see an overlay of all the street signs your city owns and click on one to insert a note into some database that a worker needs to go repair it… stuff like that).
Lots of their stuff was delivered as Silverlight apps. It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder, rather than the Byzantine HTML/JS/CSS ecosystem.
I get why it never took off, but in this niche of small-time custom software it was really way nicer than anything else that existed at the time. Web distribution combined with classic desktop GUI development.
Geocities ; It was "put your HTML here" free web hosting back when people barely knew what HTML was. Today you have to be a rocket scientist to find a way to host a free, simple static page online.
Valid option - I used it myself for a very brief toe-dip into blogging earlier this year - but maybe worth noting that Google seems to flat-out refuse to crawl anything you put there. Won't pick it up by itself, won't read a sitemap you explicitly tell it about. It'll grudgingly index specific page URLs you tell it about, but that's kind of absurd. I don't know if it's because it's on a subdomain, or a Microsoft property, or because I was 100% ad- and tracker-free or what.
I tried DDG (Bing-backed, I believe) and it happily found everything with no manual intervention at all. That was the point where I ditched Google Search after 30 years.
Tumblr is nothing like a webpage. LLMs were just invented 5 minutes ago and are losing money hand over fist until people are dependent, then they will be very expensive to use; and you still have to figure out how to host, where to host, and how much it's going to cost you. So, I have no idea what you're getting at.
You could have said Wordpress.com or something. It's not quite a website, but it's close. It's also probably going to be Typepad (i.e. defunct) in a few years and Blogger is probably going to be there quicker than that.
I really liked Google Circles, a feature of Google+ social media. It allowed you to target content to specific groups of users. You could have a "family" circle or a "work" circle and not have to worry about cross posting something accidentally. It was a small thing but it made it really easy to manage your posts.
I loved my N900, and my N800 before that, and I would have loved to have seen successors. Ultimately, I ended up switching to Android because I was tired of things only available as apps. Since then, web technologies have gotten better, and it's become much more feasible to use almost exclusively websites.
They should have partnered not only with Intel, but with Palm, RIM or whatever other then-giant to rival Android. Those two went their own ways with WebOS and buying QNX, so maybe they could have agreed to form a consortium for an open and interoperable mobile OS
Microsoft Songsmith is another one that deserved a second life. It let you hum or sing a melody and would auto-generate full backing tracks, guitar, bass, drums, chords, in any style you chose.
It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early.
If Microsoft had kept iterating on it with modern ML models, it could’ve become the "GarageBand for ideas that start as a hum."
It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.
The Lockheed D-21 drone. Supersonic ramjet without the complexity of scramjet or the cost of turbojet, hamstrung by the need for a manned launch platform (making operations safety-critical… with predictable results) and recovery to get data off it. Twenty or forty years later it would have been paired by a small number of high-cost launcher UAVs and had its cost driven down to disposable, with data recovery over radio comms… but twenty to forty years later there’s nothing like it, and the maturation of satellites means there almost certainly never will be.
Boot2Gecko, or whatever the browser-as-operating-system was called. This was a project that should have focused on providing whatever its current users needed, expanding and evolving to do whatever those users wanted it to do better.
Instead it went chasing markets, abandoning existing users as it did so in favour of potentially larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it.
For a few short months circa 2016 or 2017, KaiOS was the number one mobile OS in India. This was probably because of all the ultra-cheap KaiOS-powered Reliance Jio phones flooding the Indian market at the time.
I noticed the trend when I was working on a major web property for the Aditya Birla conglomerate. My whole team was pleasantly surprised, and we made sure to test everything in Firefox for that project. But everyone switched to Android + Chrome over the next few years, which was a shame.
I adored my Firefox Phones. Writing apps was so easy I built myself dozens of little one-offs. Imagine if it had survived to today, its trivial html/css/js apps could be vibe coded on-device and be the ultimate personalized phone.
Luckily it wasn't long after Mozilla abandoned it that PWAs were introduced and I could port the apps I cared about.
In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then.
I find it wild how this project was 90%+ correct on how we will build web apps 14 years later.
Non DAW. Breaking each function of the DAW into its own application gave a better experience in each of those functions; when you only needed one aspect, you weren't working around everything else the DAW offers. The integration between the various parts was not all it could be, but I think the idea has some real potential.
Thought about Non immediately, but I figured it must have (had) about 2 other users amongst HNers, though. :) Nice to see it mentioned.
I used it quite a bit to produce radio shows for my country's public broadcasting. Because Non's line-oriented session format was so easy to parse with classic Unix tools, I wrote a bunch of scripts for it with Awk etc. (E.g. calculating the total length of clips highlighted with brown color in the DAW -- which was stuff meant for editing out; or creating a poor man's "ripple editing" feature by moving loosely-placed clips precisely side by side; or, eventually, converting the sessions to Samplitude EDL format, and, from there, to Pro Tools via AATranslator [1] (because our studio was using PT), etc. Really fun times!)
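To make that concrete, here is the flavour of such a script in Python rather than Awk. The field layout below is invented for illustration (Non's real session format differs), but the line-oriented approach is the point:

```python
# Summing the lengths of clips highlighted in brown, against a made-up
# line-oriented session format:  clip <start_sec> <length_sec> <color> <name>
session = """\
clip 0.0 12.5 brown intro-take-2
clip 12.5 30.0 green interview
clip 42.5 4.0 brown stumble
"""

total = 0.0
for line in session.splitlines():
    kind, start, length, color, name = line.split(maxsplit=4)
    if kind == "clip" and color == "brown":   # brown = marked for editing out
        total += float(length)

print(f"marked for removal: {total:.1f} seconds")  # -> 16.5 seconds
```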
I've never heard of this software before. Any idea why it was discontinued? There are a bunch of weird messages that point to a sort of hostile takeover of the project by forking, but nothing says why or how it was discontinued.
RethinkDB. Technically it still exists (under The Linux Foundation), but (IMO) the original company's widening scope (the Horizon BaaS) that eventually led to its demise killed its momentum.
Macromedia Flash. Its scope and security profile were too big. It gave way to HTML's canvas. But man, the tooling is still nowhere near as good. MovieClips, my beloved. I loved it all.
The iPhone killed Flash, probably because it would've been a way to create apps for it, more probably because it would've been laggy in the 2007 hardware, and people would've considered the iPhone "a piece of junk".
Interesting how Flash became the almost universal way to play videos in the browser, in the latter half of the 2000's (damn I'm old...).
It's incredible to me that they killed the whole tool instead of making a JS/Canvas port. Even without "full Flash websites", there's still a need for vector animations on the web.
As a Linux user, I hated Flash with a passion. It mostly didn't work despite several Linux implementations. About the time they sorted all the bugs out, it went away. Good riddance.
Was recently reading about Project Ara, the modular smartphone project by Google/Motorola [1]. Would have liked to see a few more iterations of the idea. Something more customizable than what we have today without having to take the phone apart.
ICQ ; It was the first instant messenger, and the technology could have adopted voice (and not been disrupted by Skype), mobile (and not been disrupted by WhatsApp), and group chat (and not been disrupted by Slack/Discord). But they didn't even try to put up a fight.
They got bought by AOL in 98, long before most/all of this innovation happened?
Edit: in fact I'd say they were irrelevant before pretty much all of those innovations. By the time AIM or MSN Messenger really became popular, ICQ didn't matter anymore.
Skype ; Because my R.I.P. grandma was using it to talk to her relatives overseas just like she would use a phone, but it didn't cost an arm and a leg (unlike phone calls).
One of the best P2P software at the time. It was so simple and effective and allowed people to call real phones with Skype credit.
A genius product ruined by Microsoft. Have you used Microsoft Teams recently? Bad UI, hard-to-configure external hardware, a good level of incompatibility, and the good old "Echo / Sound Test Service" is missing. At one point I even installed Skype on my old Android, but it was sucking up too much battery.
BT had this grand vision for basically providing rich multimedia through the phone line, but in ~1998. Think a mix of on-demand cable and "teleconferencing" with TV-based internet (Ceefax/red button on steroids).
It would have been revolutionary and kick-started the UK's jump into online rich media.
However, it wouldn't have got past the regulators, as both Sky and NTL (now Virgin) would have protested loudly.
I tried it twice and the onboarding experience was insurmountable. Never managed to achieve a critical mass of followers or whatever they call it, so things were permanently read-only for me. I'd reply but nobody saw it.
It was a fascinating protocol underneath, but the social follow structure seemed to select strongly for folks who already had a following or something.
Drama has killed the technological progress in open source, if you ask me.
Having seen what goes on in the foss world and what goes on in the large faang-size corporate world, no wonder the corporate world is light-years ahead.
You don't need hierarchy, but you need some sort of process. "Consensus-based" just means that the loudest and most enduring shouters get their way, and when their way fails spectacularly, they leave in a huff (taking their work with them, badmouthing the project, and likely starting a fork that will pull more people out of the project and confuse potential users who just bail on trying either.)
Those people need to be pushed out early and often. That's what voting is for. You need a supermajority to force an end to discussion, and a majority to make a decision. If you hold up the discussion too long with too slim a minority, the majority can fork your faction out of the group. If the end of debate has been forced, and you can't work with the majority, you should leave yourself.
None of this letting the bullies get their way until everything is a disaster, then splitting up anyway stuff.
It might be too soon to call it abandoned, but I was very intrigued by the Austral [1] language. The spec [2] is worth reading, it has an unusual clarity of thought and originality, and I was hoping that it would find some traction. Unfortunately it seems that the author is no longer actively working on it.
I played with Austral about a year ago and really wanted to use it for my projects, but as a hobbyist and mostly inept programmer it lacked the community and ecosystem I require. I found it almost intuitive and the spec does an amazing job of explaining the language. Would love to see it get a foothold.
The author got hired by Modular, the AI startup founded by the creators of LLVM and Swift, and is now working on the new language Mojo.
He’s been bringing a bunch of ideas from Vale to Mojo
Oh nice! I just had an excuse to try Mojo via MAX inference; it was pretty impressive. Basically on par with vLLM for some small benchmarks, with a bit of variance in TTFT and TPOT. Very cool!
It's been a number of years but my understanding was they kind of killed all the momentum it had by removing support for custom operators which broke everyone's code?
Yeah, Opa was wildly ahead of its time, I actually just wrote a top level comment about it. Basically Next.js+TypeScript+modern ECMAScript features, but in 2011.
A simple UI programming pattern, with a circular, unidirectional data flow. It is very rigid by design, to be side-effect free, functional, unidirectional.
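A toy sketch of that loop (the Elm-style message/update/view shape) in Python, assuming a bare counter app; the real pattern is pure and typed, this only mimics the data flow:

```python
# Unidirectional loop: messages feed update(), which returns a new model,
# which view() renders; the rendered UI is the only source of new messages.
def update(model, msg):
    # The single place where state changes.
    if msg == "increment":
        return model + 1
    if msg == "decrement":
        return model - 1
    return model

def view(model):
    # Pure rendering: model in, display out, no side effects on the model.
    return f"count = {model}"

model = 0
for msg in ["increment", "increment", "decrement"]:  # simulated user clicks
    model = update(model, msg)
    print(view(model))   # count = 1, count = 2, count = 1
```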
MS Sidewinder Force Feedback Pro (1997) and Sidewinder Force Feedback 2 (USB).
You can buy similar today, but nowhere near the price point. Also the out-of-the-box support by Windows has vanished, and with it the incentive for game developers to include force feedback.
CLPM, the Common Lisp Package Manager. The Quicklisp client doesn't do HTTPS, ql-https doesn't do Ultralisp, and OCICL (which I'm currently using) doesn't do system-wide packages. CLPM is a great project, but it's gone neglected long enough that it's bitrotted and needs some thorough patching to be made usable. Fortunately Common Lisp is still as stable as it has been for 31 years, so it's just the code which interacts with 3rd-party libraries that needs updating.
Yeah, I felt that Quicklisp doesn't have the same features as package managers in other languages, and HTTPS is one of them. Also it's run by a single person who doesn't have much time to constantly update the libraries.
In comparison I found Clojars^[0] for Clojure better and community-driven, like NPM. But obviously Clojure has more business adoption than CL.
Lazarus is nice but both its apis and the ui feel like they're still stuck in the early 00's.
It's not enough to look like VB6 / Delphi these days; you've got to keep up with what kinds of conventions we expect now.
It's a real shame its raster functionality wasn't integrated into Illustrator. Adobe really butchered the whole Macromedia portfolio, didn't they?
(For those unfamiliar, Illustrator is a pure vector graphics editor; once you rasterize its shapes, they become uneditable fixed bitmaps. Fireworks was a vector graphics editor that rendered at a constant DPI, so it basically let you edit raster bitmaps like they were vectors. It was invaluable for pixel-perfect graphic design. Nothing since lets you do that, though with high-DPI screens and resolution-independent UIs being the norm these days, this functionality is less relevant than it used to be.)
At my last job, our designer was a Fireworks holdout. It was very pleasant. As someone who has to implement UIs, I greatly preferred it to Figma, though with today's flat, boring designs there's a lot less slicing.
Nokia Maps. There was a brief period in the early 2010s where Nokia had the best mapping product on the planet, and it was given away for free on Lumia phones at a time when TomTom and Garmin were still charging $60+ for navigation apps.
10/GUI did some deep thinking about the limitations and potential of the (then fairly new) multi-touch input method.
I wished something more had come out of it, instead it stayed a niche concept art video that is mostly forgotten now.
I’m not arguing the solutions it outlined are good, but I think some more discussion around how we interact with touch screens would be needed. Instead, we are still typing on a layout that was invented for mechanical typewriters - in 2025, on our touch screens.
I've argued this for years on this site...but AOL.
At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and adopt gym level subscription tactics to boot, and people decided they'd rather not pay it which is to be expected. I often wonder if they'd have survived as a software company otherwise.
They were basically a better thought out Facebook before Facebook, in my opinion.
RAM Disks. Basically extremely fast storage using RAM sticks slotted into a specially made board that fit in a PCIe slot. Not sure what happened to the project exactly but the website disappeared sometime in 2023.
The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB and file transfer rates can be a big bottleneck.
Products to attach RAM to expansion slots have long existed and continue to be developed. It's a matter of adding more memory once all of the DIMMs are full.
What to do with it, once it's there, is a concern of software, but specialized hardware is needed to get it there.
VPRI, I was really hoping it would profoundly revolutionise desktop application development and maybe even lead to a new desktop model, and instead they wound up the project without having achieved the kind of impact I was dreaming of.
The TUNES [1] operating system and programming language project.
The reasons for its failure are described perfectly on the archival website:
> TUNES started in 1992-95 as an operating system project, but was never clearly defined, and it succumbed to design-by-committee syndrome and gradually failed. Compared to typical OS projects it had very ambitious goals, which you may find interesting.
I always thought Microsoft Popfly had huge potential and was way ahead of its time. It made building web mashups feel like playing with Lego blocks: drag, drop, connect APIs, and instantly see the result.
If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.
Connect your phone to a display, mouse, keyboard and get a full desktop experience.
At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB A instead of a single USB c cable) and virtualization and containers not quite there.
Today, going via pKVM seems like a promising approach. Seamless sharing of data, apps, etc. will take some work, though.
Anyone remember Openmoko, the first commercialised open-source smartphone? It was heaps buggy, not really polished, etc. Its only redeeming feature was the open source software and hardware (specs?).
There was the https://en.wikipedia.org/wiki/PinePhone and its successor, the PinePhone Pro. Bugginess and general impracticalities brought up to more recent standards. Inflation-adjusted, of course!
People talk so much about how you need to write code that fits well within the rest of the codebase, but what tools do we have to explore codebases and see what is connected to what? Clicking through files feels kind of stupid because if you have to work with changes that involve 40 files, good luck keeping any of that in your working memory. In my experience, the JetBrains dependency graphs also aren't good enough.
Sourcetrail was a code visualization tool that let you see those dependencies and click around the codebase that way, seeing which methods are connected to what and so on, thanks to a lovely UI. I don't think it was enough alone, but I absolutely think we need something like this: https://www.dbvis.com/features/database-management/#explore-... but for your code, especially for codebases with hundreds of thousands of lines, or above a million SLoC.
I yearn to some day view entire codebases as graphs with similarly approachable visualization, where all the dependencies are highlighted when I click an element. This could also go so, so much further - you could have a debugger breakpoint set and see the variables at each place, alongside being able to visually see how code is called throughout the codebase, or hell, maybe even visualize every possible route that could be taken.
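The raw data such a tool needs is not exotic. Here is a toy sketch that extracts a static call graph from one Python file with the standard ast module; the visualization on top is the hard, and missing, part:

```python
# Toy sketch of the data a Sourcetrail-like tool extracts: a static
# call graph, here for one Python source string using the ast module.
import ast

source = '''
def parse(data): return clean(data)
def clean(data): return data.strip()
def main(): print(parse(" hi "))
'''

tree = ast.parse(source)
edges = []
for func in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
    for call in [n for n in ast.walk(func) if isinstance(n, ast.Call)]:
        if isinstance(call.func, ast.Name):        # skip method calls for brevity
            edges.append((func.name, call.func.id))

print(edges)  # [('parse', 'clean'), ('main', 'print'), ('main', 'parse')]
```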
Flickr - that was the future of photo storage, sharing, discovery.
What was the social bookmarking tool from the 00's called? I loved it and it fell off the earth. You could save your bookmarks, “publish” them to the community, share, etc.
Whatever happened to those build-your-own-homepage apps like Startpage (I think)? I always thought those would take off.
Dreamweaver or some other real WYSIWYG web page editor that could maybe deal with very basic JavaScript.
I just wanna make a mostly static site with links in and out of my domain. Maybe a light bit of interactivity for things like search that autocompletes.
>This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache.
>Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time.
>The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships.
I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by xAI and gave up promoting the project.
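Stripped of the driver-level plumbing, the core mechanism is a read-through chunk cache. A hedged sketch; fetch_remote and the chunk size are stand-ins for whatever Via actually does:

```python
# Read-through chunk cache: first access to a chunk goes to the network,
# later reads are served from local storage, so a game's "real" size is
# only the part of it you have touched.
CHUNK = 1024 * 1024  # 1 MiB; arbitrary for this sketch

class OnDemandFile:
    def __init__(self, fetch_remote):
        self.fetch_remote = fetch_remote  # chunk_index -> bytes (stand-in server)
        self.cache = {}                   # downloaded chunks, keyed by index

    def read(self, offset, size):
        out = b""
        for idx in range(offset // CHUNK, (offset + size - 1) // CHUNK + 1):
            if idx not in self.cache:             # first access: fetch on demand
                self.cache[idx] = self.fetch_remote(idx)
            out += self.cache[idx]
        start = offset % CHUNK
        return out[start:start + size]

f = OnDemandFile(lambda idx: b"x" * CHUNK)  # fake remote returning dummy chunks
data = f.read(offset=3_000_000, size=10)    # downloads chunk 2 once, then cached
```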
Wait until you hear that almost all Unity games don't really have asset streaming because the engine loads things eagerly by default.
I don't see how this could take off. Internet speeds are getting quicker, disk space is getting cheaper, and this will slow down load times. And what's worse is the more you need this tech the worse experience you have.
Knowing when to say "no" to a project is an important skill.
One always must define a one sentence goal or purpose, before teams think about how to build something.
Cell processors, because most coders can't do parallelism well
Altera consumer FPGA, as they chose behavioral rather than declarative best practices... then the Intel merger... metastability in complex systems is hard, and most engineers can't do parallelism well...
World Wide Web, because social-media and Marketers
Dozens of personal projects, because sometimes things stop being fun. =3
Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.
CueCat ; an affordable barcode scanner that anyone could connect to their computer, and it scanned barcodes. It took almost two decades before we could finally do it again with our mobile phones.
The IBM schools computer. Developed by IBM Hursley in 1967, it was years ahead in its design, with display out to a television and storage on normal audio tape. It would have kick-started an educational revolution had it been launched beyond the 10 prototype machines.
XenClient. I would really love to have some minimal hypervisor OS running, and then you slap multiple OSes on top of that with easy full-GUI switching via hotkeys like Ctrl+Shift+F1. Additionally, special drivers to virtualize graphics and audio devices, so every VM has full desktop capabilities and low latency.
Unfortunately, it died because it's very niche, and they couldn't keep up with driver development for desktops. This is even worse today...
I came to say Opa too. I liked the language but the meteor-like framework it was bundled with, while nice for prototyping, was a pain to work around when it didn't do what you needed.
That said, frameworks were all the buzz back in the day, so the language alone probably wouldn't have gone anywhere without it.
Mozilla Heka. As far as data collection and processing go, we are still stuck with Logstash after all these years. Heka promised a much more efficient solution, implemented in Go with Lua plugins.
In the late 90s there was a website called fuckedcompany, which was a place where people could spill the beans about startups (mainly in Silicon Valley). It was anonymous and a pretty good view into the real state of tech. Now there is twitter/x, but it's not as focused on this niche.
The closest sites I've found are Web3 is Going Just Great and Pivot to AI, which are newsfeeds of various car crashes in their respective hype arenas, although without any insider scoops/gossip.
wua.la … the original version. You share part of your storage to get the same amount back as resilient cloud storage from others. Was bought and killed by LaCie (now Seagate). They later provided paid-for cloud storage under the same name but it didn’t take off.
Pivotal Tracker ; Users loved it, it had an excellent model for tracking work and limiting work in progress on software projects. There is no real good alternative and the usual suspects for tracking project work are horrible in comparison.
https://www.kite.com for Python
I first learned about it when I was working in a university group and had the task of porting a windowing algorithm already working in MATLAB to Python.
It felt like a modern linter and LSP with additional support through machine learning. I don't quite know why it got comparatively little recognition, but perhaps enough to remain an avant-garde pioneer of both Python and machine-learning support for further generations and wider applications.
Google Wave ; It had a bunch of agents participating in editing the text together with you, making spelling fixes, finding additional information to enrich your content, and so much more.
I'm booting and running Haiku on my Thinkpad. It's a from-scratch workalike of BeOS, and able to run Be software. Though, frankly, Be software is totally 1990s, so a lot of Linux software written for Qt has been ported to Haiku.
In the end I wound up with basically the same application software as on my Debian desktop, except running on Haiku instead of Linux. Haiku is noticeably snappier and more responsive than Linux+X+Qt+KDE, though.
In late September or early October 1996, Fry's Electronics placed a full-page promo ad on the back of the business section of the San Jose Mercury News for OS/2 4.0 "WRAP [sic]" in 256 pt font in multiple places. Oops!
Nah, that time has passed and there's not much to miss from the base OS. What would be interesting is for IBM to publish the source to the Workplace Shell and the underlying SOM code so it might get a new life running on one of the free *nixes.
I could think of many examples, but I'll talk about the top four that I have in mind, that I'd like to see re-evaluated for today's times.
1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately Vista was famously delayed, and in an attempt to get Vista released, Microsoft pared back features, and one of these features was WinFS. Instead of WinFS, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. (See the sketch after this list for a toy version of the idea.)
2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers can combine these components to form larger solutions. For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components can be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E.
OpenDoc was a radically different paradigm for software development and distribution, and I think this could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog).
Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o).
I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications.
I think it would have been cool had the FOSS community pursued component-based software. It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc.
3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk.
4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects like Symbolic Genera.
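As promised under item 1, here is a toy version of the WinFS idea, faked with the standard sqlite3 module rather than a real database-backed file system; the schema and query are illustrative assumptions, not WinFS's actual design:

```python
# Treat file metadata as rows you can query instead of directories you crawl.
import os, sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, name TEXT, size INTEGER, mtime REAL)")

for root, _dirs, names in os.walk("."):
    for name in names:
        path = os.path.join(root, name)
        try:
            st = os.stat(path)
        except OSError:
            continue  # skip unreadable entries and dangling links
        db.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                   (path, name, st.st_size, st.st_mtime))

# "Find my large recent files" becomes a query rather than a manual search:
for row in db.execute(
        "SELECT path, size FROM files "
        "WHERE size > 1000000 ORDER BY mtime DESC LIMIT 5"):
    print(row)
```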
Re: obtaining a legal copy of Genera, as of 2023 Symbolics still existed as a corporate entity and they continued to sell x86-64 laptops with "Portable Genera 2.0". I bought one from them then, and occasionally see them listing some on Ebay. (This isn't intended as an advertisement or endorsement, just a statement. I think it's quite unfortunate that Symbolics's software hasn't been made freely available, since it's now really only of historical interest.)
I'm intrigued by Symbolics Genera too. It would have been interesting to see further development of a Lisp OS, especially once it had an internet connection. Rewriting part of your OS and seeing the changes in real time? Maybe web apps could have been just software written in Lisp, downloaded onto the machine and executed directly in a safe environment on top of the Genera image. Big stuff.
OpenDoc was mostly given to Taligent (the Apple and IBM joint venture) to develop. It was full-on OO: about 35 files for a minimal application, which meant that Erich Gamma had to build a whole new type of IDE which was unusable. He likely learned his lesson: it's pretty hard to define interfaces between unknown components without forcing each one to know about all the others.
MIME types for mail addressed much of the demand for pluggable data types.
Windows Phone's UI is still with us, from Windows 8 onwards. Everything on 8, 10, and 11 is optimized for a touch interface on a small screen, which is ridiculous on a modern desktop with a 32" or so monitor and a trackball or mouse.
False. The Metro design was abandoned long ago. No live tiles, no typography-first minimal UIs in Windows 10/11. If I pin an email app to the taskbar/Start, I don't see the unread count.
From Windows 10, there is a switch between desktop and touch mode.
They stopped supporting small tablets some years ago though, and made it worse with every Windows update. I can only surmise that it was to make people stop using them. Slow GUI, low contrast, killed apps.
For me, DESQview. Microsoft tried to buy it in order to use its tech in their Windows system. I wonder how things would be today had they been able to purchase it. But DESQview said "no".
Instead it went into a slow death spiral due to Windows 95.
Love seeing this one. My uncle was co-founder of Quarterdeck, and I grew up in a world of DESQview and QEMM. It was a big influence on me as a child.
Got a good family story about that whole acquisition attempt, but I don't want to speak publicly on behalf of my uncle. I know we've talked at length about the what-ifs of that moment.
I do have a scattering of some neat Quarterdeck memorabilia I can share, though:
DESQview/X sucked the wind out of DESQview's sails. It was, on paper, a massive upgrade. I had been running DESQview for years, with a dial-up BBS in the background.
But you couldn't actually buy /X. After trying to buy a copy, my publisher even contacted DESQ's marketing people to get a copy for me, and they wouldn't turn one over. Supposedly there were some copies actually sold, but too few, too late, and then /X was dropped. There was at least one more release of plain DESQview after that, but by then Windows was eating its lunch.
OSI's session layer did very little more than TCP/UDP port numbers; in the OSI model you would open a connection to a machine, then use that connection to open a session to a particular application.
X.400 was a nice idea, but the ideal of having a single global directory predates security. I can understand why it never happened.
On X.509, the spec spends two chapters on attribute certificates, which I've never seen used in the wild. It's a shame; identity certificates do a terrible job at authentication.
Fortress language. It suffered from being too Haskell-like, with too many non-orthogonal features. Rust and Go applied lessons from it, perhaps indirectly.
Their operator precedence system was one of my favourite pieces of language design. The tl;dr was that you could group operators into precedence sets, and an expression involving operators that all came from the same set would have that set's precedence rules applied, but if you had an expression involving mixed sets you needed to add the parentheses. Crucially, they also supported operator overloading, and the same operator could be used in a different set as long as everything could be parsed unambiguously. (Caveat: I never used the language, I just read about the operator design in the docs, and it was very eye-opening in the sense that every other language's operator precedence system suddenly felt crude and haphazard.)
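A toy version of that mixed-set rule in Python; this is not Fortress's actual algorithm, and the sets here are invented for illustration:

```python
# Operators appearing together at one parenthesization level must all
# come from a single precedence set, otherwise the expression is rejected
# and the programmer has to add parentheses.
ARITH   = {"+", "-", "*", "/"}
LOGICAL = {"and", "or", "xor"}
SETS = [ARITH, LOGICAL]

def check_level(operators):
    """operators: the operator tokens appearing at one nesting level."""
    if not any(all(op in s for op in operators) for s in SETS):
        raise SyntaxError(f"mixed precedence sets in {operators}; add parentheses")

check_level(["+", "*"])       # fine: both arithmetic, usual precedence applies
check_level(["and", "or"])    # fine: both logical
# check_level(["+", "and"])   # SyntaxError: mixed sets, needs parentheses
```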
LSR, the "Linux Screen Reader", an ambitiously designed Python implementation of a GUI screen reader developed by IBM starting around 2006 or so. The project ended in 2008 when IBM ended all its accessibility involvement in FLOSS.
Humane AI Pin. I think they launched 2 years too early and were too greedy with device pricing and the subscription. Also, if they had focused on being an accessory for Android/iPhone, they could have reduced power usage and cost as well.
Their execution was of course bad, but today's LLMs are better and faster, and there are many more OSS models to reduce costs. The hardware looked nice, though, and the pico projector was an interesting concept, even if not the best executed.
Wine predates ReactOS. It was basically a FOSS duplicate of Sun's WABI.
I wrote a bunch of software in Borland Delphi, which ran in Windows, Wine, and ReactOS with no problems. Well, except for ReactOS' lack of printing support.
As long as you stay within the ECMA or published Windows APIs, everything runs fine in Wine and ReactOS. But Microsoft products are full of undocumented functions, as well as checks to see if they're running on real Windows. That goes back to the Windows 3.1 days, when 3.1 developers regularly used OS/2 instead of DOS, and Microsoft started adding patches to fail under OS/2 and DR-DOS. So all that has to be accounted for by Wine and ReactOS. A lot of third-party software uses undocumented functions as well, especially stuff written back during the days when computer magazines were a thing, and regularly published that kind of information. A lot of programmers found the lure of undocumented calls to be irresistible, and they wound up in all kinds of commercial applications where they really shouldn't have been.
In my experience anything that will load under Wine will run with no problems. ReactOS has some stability problems, but then the developers specifically call it "alpha" software. Despite that, I've put customers on ReactOS systems after verifying all their software ran on it. It gets them off the Microsoft upgrade treadmill. Sometimes there are compatibility problems and I fall back to Wine on Linux. Occasionally nothing will do but real Windows.
Hard disagree. The Humane AI Pin ad was a classic silicon valley ad that screamed B2VC and demonstrated nothing actually useful that couldn't be done with an all-in-one phone app (or even the ChatGPT app) and bluetooth earbuds that you already have.
Which reduces its innovation level to nothing more than a chest-mounted camera.
You want real B2C products that people would actually buy? Look at the Superbowl ads instead. Then watch the Humane ad again. It's laughable.
Ceylon, JVM language, developed by Red Hat, now abandoned at Eclipse. Lost the race with Kotlin but proposed more than just syntax sugar over Java. Anonymous union types, comprehensions, proper module system...
Nah, Glass was impressive for such a big org as Google, but smartphones are popular because people use them like portable televisions. Glanceable info and walking directions are more like an Apple Watch-sized market, without the fashion element. Meta is about to find out.
Google Glass sucks though, and glasses will never be a thing. Google and Meta and … can spend $8T and come up with the most insane tech etc., but no one will be wearing f'ing glasses :)
Apple’s scanning system for CSAM. The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.
Good riddance to a system that would have provided precedent for client-side scanning for arbitrary other things, as well as likely false positives.
> I wanted there to be a reasonable debate on it
I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".
This. No matter how cool the engineering might have been, from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for… Apple was very much creating the Torment Nexus from “Don’t Create the Torment Nexus.”
> from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for…
I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?
I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
The problem isn’t the system as implemented; the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Once that idea appears, it allows every lobbyist and insider to say “mandate this, we’ll do something like what Apple did but for other types of Bad People” and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom.
Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.
> the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Apple never made that assertion, and the system they designed is incapable of doing that.
> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.
Apple’s system cannot do that. If you change parts of it, sure. But the system they proposed cannot.
To reiterate what I said earlier:
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
So far, you are saying that you don’t have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn’t propose, that is significantly different in multiple ways.
Also, what do you mean by “model”? When I used the word “model” it was in the context of using another system as a model. You seem to be using it in the AI sense. You know that’s not how it worked, right?
> Chat Control, and other proposals that advocate backdooring individual client systems.
Chat Control is older than Apple’s CSAM scanning and is very different from it.
> Clients should serve the user.
Apple’s system only scanned things that were uploaded to iCloud.
You missed the most important part of my comment:
> I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
I don’t think you can accurately describe it as client-side scanning and false positives were not likely. Depending upon how you view it, false positives were either extremely unlikely, or 100% guaranteed for practically everybody. And if you think the latter part is a problem, please read up on it!
> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.
No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?
> How about making Apple vulnerable to demands from every government where they do business?
They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.
There is no place for spyware of any kind on my phone. Saying that it is to "protect the children" and "to catch terrorists" does not make it any more acceptable.
I believe my retro Nokia phones (S60/S90) do not have any spyware. I believe earlier Nokia models like the S40 or monochrome ones do not even have the ability to spy on me (though RMS considers triangulation spyware). I don't believe any products from the duopoly, which don't even give you root access, are free from all kinds of vendor rootkits.
Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple’s system, you can complain about that imaginary system all you want, but it won’t be meaningful, it’s just noise.
You understand it just fine, you're just trying to pass you fantasy pod immutable safe future as rational while painting the obvious objections based on the real world as meaningless noise.
Your point did not come across. It still isn’t. I don’t know what you mean by “pass you fantasy pod immutable safe future as rational”. You aren’t making sense to me. I absolutely do not “understand it just fine”.
If they are running safe mandatory scans on your phones for this, you seem shocked and angry that anyone would imply that this would lead to safe mandatory scans on your phones for that and the other, and open the door for unsafe mandatory scans for whatever.
If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion. It gives you crazy person or salesman vibes. These are arguments that someone with a serious interest in the technology would be aware of already and should be included as a prerequisite to being taken seriously. Doing this shows that you value other people's time and effort.
Founder perspective: “avoid patents by staying 20 years behind” is the tragedy.
I published a 2-page CC0 initiative that splits protection into two layers:
• GLOBAL layer — fast, low-friction recognition for non-strategic inventions
• LOCAL-STRATEGIC layer — conventional national control for sensitive tech
Goal: cut admin drag/time-to-market while keeping sovereignty intact.
- Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.
- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?
- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.
XHTML appeals to the intuition that there should be a Strict Right Way To Do Things ... but you can't use that unforgiving framework for web documents that are widely shared.
The "real world" has 2 types of file formats:
(1) file types where consumers cannot contact/control/punish the authors (open-loop): HTML, pdf, zip, csv, etc. The common theme is that the data itself is more important than the file format. That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries. And both 7-Zip and WinRAR can read malformed zip files with broken headers (because some old buggy Java libraries wrote bad zip files). MS Excel can import malformed csv files. E.g. the Citi bank export to csv wrote a malformed file, and it was desirable that MS Excel imported it anyway, because the raw data of dollar amounts was more important than the incorrect commas in the csv file -- and I have no way of contacting the programmer at Citi to tell them to fix their buggy code that created the bad csv file.
(2) file types where the consumer can control the author (closed-loop): programming language source code like .c, .java, etc or business interchange documents like EDI. There's no need to have a "lenient forgiving" gcc/clang compiler to parse ".c" source code because the "consumer-and-author" will be the same person. I.e. the developer sees the compiler stop at a syntax error so they edit and fix it and try to re-compile. For business interchange formats like EDI, a company like Walmart can tell the vendor to fix their broken EDI files.
XHTML wants to be in group (2) but web surfers can't control all the authors of .html so that's why lenient parsing of HTML "wins". XHTML would work better in a "closed-loop" environment such as a company writing internal documentation for its employees. E.g. an employee handbook can be written in strict XHTML because both the consumers and authors work at the same company. E.g. can't see the vacation policy because the XHTML syntax is wrong?!? Get on the Slack channel and tell the programmer or content author to fix it.
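To make the two groups concrete, here's a minimal sketch in Python, stdlib only and with made-up sample markup: the same broken document is rejected outright by a strict XML parser, while a lenient HTML parser salvages the data, which is exactly the open-loop behavior described above.

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    # Made-up sample markup: an unclosed <b>, the classic tag-soup mistake.
    doc = "<html><body><p><b>bold text<p>more text</p></body></html>"

    # Group (2) behavior: stop at the first sign of trouble.
    try:
        ET.fromstring(doc)
    except ET.ParseError as err:
        print("strict parser refused:", err)  # mismatched tag

    # Group (1) behavior: never error out, salvage whatever data is there.
    class Lenient(HTMLParser):
        def handle_data(self, data):
            if data.strip():
                print("lenient parser recovered:", data.strip())

    Lenient().feed(doc)  # recovers "bold text" and "more text"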
On the other hand, imagine a world where Chrome would slowly start to phase out its quirks modes. Something like a yellow address bar and a "Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message. Turn it into a red bar and a "click to continue" after 10 years, remove it altogether after 20 years. Suddenly it's no longer that one weird customer who is complaining, but everyone - including your manager. Your mistakes are painfully obvious during development, so you have a pretty good incentive to properly follow the spec. You make a mistake on a prominent page and the CTO sees it? Well, guess you'll be adding an XHTML validator to your CI pipeline next week!
It is very tempting to write a lenient parser when you are just one small fish in a big ecosystem, but over time it will inevitably lead to the degradation of that very ecosystem. You need some kind of standards body to publish a validating reference parser. And like it or not, Chrome is big enough that it can act as one for HTML.
> We rank valid XHTML higher
It doesn’t even have to be true!
Amen. Postel’s Law was wrong:
https://datatracker.ietf.org/doc/html/rfc9413
We stop at the first sign of trouble for almost every other format, we do not need lax parsing for HTML. This has caused a multitude of security vulnerabilities and only makes it more difficult for pretty much everybody.
The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds and that the role of a standard was just to write down all the bugs.
I, for one, am kinda happy that XHTML is dead.
XHTML allows you to use XML, and <bold> and <italic> are just XML nodes with no schema. The correct form has been and will always be <b> and <i>. Since the beginning.
HTML is not a set of instructions that you follow. It’s a terrible format if you treat it that way.
You have things backwards. The Copland project was horribly mismanaged. Anybody at Apple who came up with a new technology got it included in Copland, with no regard to feature creep or stability. There's a leaked build floating around from shortly before the project was cancelled. It's extremely unstable and even using basic desktop functionality causes hangs and crashes. In mid-late 1996, it became clear that Copland would never ship, and Apple decided the best course of action was to license an outside OS. They considered options such as Solaris, Windows NT, and BeOS, but of course ended up buying NeXT. Copland wasn't killed to justify buying NeXT, Apple bought NeXT because Copland was unshippable.
It would kill the approachability of the language.
One of the joys of learning HTML when it tended to be hand-written was that if you made a mistake, you'd still see something just with distorted output.
That was a lot more approachable for a lot of people who were put off "real" programming languages because they were overwhelmed by terrible error messages any time they missed a bracket or misspelled something.
If you've learned to program in the last decade or two, you might not even realise just how bad compiler errors tended to be in most languages.
The kind of thing where you could miss a bracket on line 47 but end up with a compiler error complaining about something 20 lines away.
Rust (in particular) got everyone to raise their game with respect to meaningful compiler errors.
But in the days of XHTML? Error messages were arcane, you had to dive in to see what the problem actually was.
What happens?
Even today, after years of better error messages, the strict validator at https://validator.w3.org/check says:
What is line 22? It's up to you to go hunting back through the document to find the unclosed 'b' tag. Back in the day, the error messages were even more misleading than this, often talking about "Extra content at end of document" or similar.
Compare that to the very visual feedback of putting this exact document into a browser.
You get more bold text than you were expecting, the bold just runs into the next text.
That's a world of difference, especially for people who prefer visual feedback to reading and understanding errors in text form.
Try it for yourself, save this document to a .html file and put it through the XHTML validator.
Firefox displays naught but the error:
Chromium displays this banner on top of the document up to the error. Chromium is much more helpful in the error message, directing the user to both line 19 and line 22. It also made the user-friendly choice to render up to the error.
In the context of XHTML, we should also keep in mind that Chrome post-dates XHTML by almost a decade.
Really, neither has particularly great handling of errors in anything XML. None of it is better than minimally maintained, a lot of it has simply been unmaintained for a decade or more.
If you appreciate Modula's design, take a look at Nim[1].
I remember reading the Wikipedia page for Modula-3[2] and thinking "huh, that's just like Nim" in every other section.
[1] https://nim-lang.org
[2] https://en.wikipedia.org/wiki/Modula-3
- I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so, Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly. One thing I think you might be hinting at, though, that is a missed trick: macOS today could learn a little from the way iOS and iPadOS are forced to do things and centralise configuration in a single place.
- I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambdas, GCP Cloud Run Functions or Azure Functions.
- Most of your other ideas (with an exception, see below) died because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet - I do wonder when we might next get a major change in hardware architectures again though, it does feel like we've now got "x86" and "ARM" and that's that for the next generation.
- XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, and messy and evolutionary, because it is a technology that serves people - we don't want people to serve the technology.
This is not true. The reason it died was because Internet Explorer 6 didn’t support it, and that hung around for about a decade and a half. There was no way for XHTML to succeed given that situation.
The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React.
People can deal with strict syntax. They can manage it with JSX, they can manage it with JSON, they can manage it with JavaScript, they can manage it with every back-end language like Python, PHP, Ruby, etc. The idea that people see XHTML being parsed strictly and give up has never had any truth to it.
Honestly I'm disappointed the promised XHTML5 never materialized alongside HTML5. I guess it just lost steam.
The HTML Standard supports two syntaxes, HTML and XML. All browsers support XML syntax just fine—always have, and probably always will. Serve your file as application/xhtml+xml, and go ham.
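A minimal sketch of trying that locally, using only the Python standard library (file layout hypothetical): map .xhtml to the XML media type and the browser should use its strict XML parser for those files.

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class XHTMLHandler(SimpleHTTPRequestHandler):
        # Map .xhtml to the XML media type; plain text/html would get
        # the lenient HTML parser instead.
        extensions_map = {
            **SimpleHTTPRequestHandler.extensions_map,
            ".xhtml": "application/xhtml+xml",
        }

    # Serves the current directory at http://localhost:8000/
    HTTPServer(("localhost", 8000), XHTMLHandler).serve_forever()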
Probably not, but what would be the benefit of having more pages fail to render? If xhtml had been coupled with some cool features which only worked in xhtml mode, it might have become successful, but on its own it does not provide much value.
I think those benefits are quite similar to having more programs failing to run (due to static and strong typing, other static analysis, and/or elimination of undefined behavior, for instance), or more data failing to be read (due to integrity checks and simply strict parsing): as a user, you get documents closer to valid ones (at least in the rough format), if anything at all, and additionally that discourages developers from shipping a mess. Then parsers (not just those in viewers, but anything that does processing) have a better chance to read and interpret those documents consistently, so even more things work predictably.
It is like Windows jumping through hoops to support backwards compatibility even with buggy software. The interest of the customer is that the software runs.
Rhetorical question: Should the browser display page even if it is commented out?
There is some bar for what is expected to work.
If all browsers would consistently error out on unclosed tags, then it would definitely force developers to close tags, it would force it become common knowledge, second nature.
What if the browser renders it incorrectly? If a corrupt tag combination leads to browser X parsing "<script>" as inline text but browser Y parsing it as a script tag, that could lead to serious security issues!
Blindly guessing at the original author's intent whenever you encounter buggy content is a recipe for disaster. Sometimes it is to the user's benefit to just refuse to render it.
This was, maybe, true some 10 years ago. Now even old Windows programs (Paint, WordPad) do not run on newer Windows.
> The interest of the customer is that the software runs
Yes, but testing is expensive and we are Agile. /s
https://keenwrite.com/blog/2025/09/08/feature-matrix/
I don't know if you know it, but that's a feature of Google Lens.
Meanwhile, local files with the doctype would be treated as XHTML, so people assumed the doctype was all you needed. So everyone who tried to use XHTML didn't realize that it would go back to being read as HTML when they upload it to their webserver/return it from PHP/etc. Then, when something went wrong/worked differently than expected, the author would blame XHTML.
Edit: I see that I'm getting downvoted here; if any of this is factually incorrect I would like to be educated please.
None of that is correct.
It was perfectly spec. compliant to label XHTML as text/html. The spec. that covers this is RFC 2854 and it states:
> The text/html media type is now defined by W3C Recommendations; the latest published version is [HTML401]. In addition, [XHTML1] defines a profile of use of XHTML which is compatible with HTML 4.01 and which may also be labeled as text/html.
— https://datatracker.ietf.org/doc/html/rfc2854
There’s no spec. that says you need to parse XHTML served as text/html as HTML not XHTML. As the spec. says, text/html covers both HTML and XHTML. That’s something that browsers did but had no obligation to.
The mismatched doctype didn’t trigger quirks mode. Browsers don’t care about that. The prologue could, but XHTML 1.0 Appendix C told you not to use that anyway.
Even if it did trigger quirks mode, that makes no difference in terms of tag soup. Tag soup is when you mis-nest tags, for instance <strong><em></strong></em>. Quirks mode was predominantly about how it applied CSS layout. There are three different concepts being mixed up here: being parsed as HTML, parsing tag soup, and doctype switching.
The problem with serving application/xhtml+xml wasn’t anything to do with web servers. The problem was that Internet Explorer 6 didn’t support it. After Microsoft won the browser wars, they discontinued development and there was a five year gap between Internet Explorer 6 and 7. Combined with long upgrade cycles and operating system requirements, this meant that Internet Explorer 6 had to be supported for almost 15 years globally.
Obviously, if you can’t serve XHTML in a way browsers will parse as XML for a decade and a half, this inevitably kills XHTML.
People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem. Modern HTML is a cesspool. I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it. Is that part of the reason why we have so few browsers?
Your chronology is waaaaaaaaaaaay off.
<BR> came years before XML was invented. It was a tag that didn’t permit children, so writing it <BR></BR> would have been crazy, and inventing a new syntax like <BR// or <BR/> would have been crazy too. Spelling it <BR> was the obvious and reasonable choice.
The <br /> or <br/> spelling was added to HTML after XHTML had already basically lost, as a compatibility measure for porting back to HTML, since those enthusiastic about XHTML had taken to writing it and it was nice having a compatible spelling that did the same in both. (In XHTML you could also write <br></br>, but that was incorrect in HTML; and if you wrote <br /> in HTML it was equivalent to <br /="">, giving you one attribute with name "/" and value "". There were a few growing pains there, such as how <input checked> used to mean <input checked="checked">—it was actually the attribute name that was being omitted, not the value!—except… oh why am I even writing this, messy messy history stuff, engines doing their own thing blah blah blah, these days it’s <input checked="">.)
Really, the whole <… /> thing is more an artefact of an arguably-misguided idea after a failed reform. The absolute mayhem came first, not last.
> I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it.
The HTML parser is magnificent, by far the best spec for something reasonably-sized that I know of. It’s exhaustively defined in terms of state machines. It’s huge, far larger than one would like it to be because of all this compatibility stuff, but genuinely easy to implement if you have the patience. Seriously, go read it some time, it’s really quite approachable.
This is untrue. This is the first public draft of XHTML from 1998:
> Include a space before the trailing / and > of empty elements, e.g. <br />, <hr /> and <img src="karen.jpg" alt="Karen" />.
— https://www.w3.org/TR/1998/WD-html-in-xml-19981205/#guidelin...
It’s far from perfect but I’ve been enjoying playing with it even for things that aren’t games and it has come a long way just in the last year or two. I feel like it’s close to (or is currently) having its Blender moment.
I still miss Macromedia Fireworks.
yeah it wasn't secure, but:
> bad performance
I don't think that's the case. For the longest while Flash was faster than JS at doing anything vaguely graphics-based. The issue for Apple was that the CPU in the iPhone wasn't fast enough to do Flash and anything else. Moreover, Adobe didn't get on with Jobs when they were talking about custom versions.
You have to remember that "apps" were never meant to be a thing on the iPhone; it was all about "desktop-like" web performance.
I agree on security and bugs, but bugs can be fixed. It just shows neglect by Adobe, which was, I think, the real problem. I think that if Adobe seriously wanted to, it could have been a web standard.
The 20 most common things you’d do with the tool were there for you in obvious toolbars. It had a lot of advanced features for image editing. It had a scripting language, so you could do bulk editing operations. It supported just about every file extension you could think of.
Most useful feature of all was that it’d load instantly. You’d click the icon on the desktop, and there’d be the Fireworks UI before you could finish blinking. Compared to 2025 Adobe apps, where you click the desktop icon and make a coffee while it starts, it’s phenomenal performance.
Adobe was never known for its security or quality.
There hasn’t been a replacement, yet.
On the other hand, for every flash game made there were about ten thousand flash-based ads, and nearly as many websites that used flash poorly for things like basic navigation (remember flash based website dropdown menus?). And for a few years it seemed like every single restaurant with a website was using flash for the entire thing; the results were borderline unusable in the best cases. And let's not forget that as long as flash was dominant, it was choking out the demand to get proper video support into browsers. Flash based video players performed like dog shit and made life on Linux a real chore.
They burned through $5B of 1999 dollars, building out a network in 23 cities, and had effectively zero customers. Finally shut down in 2001.
All their marketing was focused on "mobile professionals", whoever those were, while ignoring home users who were clamoring for faster internet where other ISPs dragged their feet.
Today, 5G femtocells have replicated some of the concept (radically small cell radius to increase geographic frequency reuse), but without the redundancy -- a femtocell that loses its uplink is dead in the water, not serving as a relay node. A Ricochet E-radio that lost its uplink (but still had power) would simply adjust its routing table and continue operating.
Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.
https://github.com/shano/Wave-ServerAdmin
It's been 16 years. I should probably archive this..
It made it seem needlessly complicated, and effectively erased all the positives.
In a sense Wave still exists but was split into multiple products, so I wouldn’t say it’s “dead”. The tech that powered it is still used today in many of Google’s popular products. It turns out that having separate interfaces for separate purposes is just more user friendly than an all-in-one.
Even the watered-down version of wave was something I used at my host startup, it was effectively our project management tool. And it was amazing at that.
I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss.
VM's persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
The technology took decades to mature, but the business people didn’t have the patience to let the world catch up to this revolutionary technology.
Kinda, but for small writes it's still nowhere near.
Samsung 990 Pro - IOPS 4KQD1 113 MBytes/Sec
P4800X optane - IOPS 4KQD1 206 MBytes/Sec
And that's a device 5 years newer and on a faster pcie generation.
It disappeared because the market that values the above attribute is too small, and it's hard to market because at first glance they look about the same on a lot of metrics, as you say (rough IOPS conversion below).
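For a rough sense of what those 4K QD1 figures mean in IOPS terms (assuming decimal megabytes and 4 KiB transfers):

    # Back-of-the-envelope: bytes per second divided by the 4 KiB transfer size.
    for name, mb_per_s in [("Samsung 990 Pro", 113), ("P4800X Optane", 206)]:
        iops = mb_per_s * 1_000_000 / 4096
        print(f"{name}: ~{iops:,.0f} IOPS at 4K QD1")
    # Samsung 990 Pro: ~27,588 IOPS at 4K QD1
    # P4800X Optane: ~50,293 IOPS at 4K QD1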
We were about to get rid of the split between RAM and disk memory and use a single stick for both!
Isn't Windows fast boot something like that (only slower, depending on the SSD)? It semi-hibernates, storing the kernel part of memory on disk for faster startup.
Optane was nearly as fast as RAM, but also persistent like a storage device. So you could do a suspend to RAM without the requirement to keep it powered like RAM.
A few more thoughts about that, since I happen to have some of the last systems that actually had system-level support for that in their firmware, and early low-capacity Optanes designed for that sort of use. It's fascinating to play with these, but they are low capacity, and bound to obsolete operating systems.
Given enough RAM, you can emulate that with working suspend to and resume from RAM.
Another avenue is the ever faster and larger SSDs; in practice, with some models it makes almost no difference anymore, since random access times are so fast and transfer speeds insane. Maybe total and/or daily TBW remains a concern.
Both of these can be combined.
Google Picasa: Everything local, so fast, so good. I'm never going to give my photos to G Photos.
Google Hangouts: Can't keep track of all the Google chat apps. I use Signal now.
Google G Suite Legacy: It was supposed to be free forever. They killed it, tried to make me pay. I migrated out of Google.
Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
Google Finance: Tracked my stocks and funds there. Then they killed it. Won't trust them with my data again.
Google NFC Wallet: They killed it. Then Apple launched the same thing, and took over.
Google Chromecast Audio: It did one thing, which is all I needed. Sold mine as soon as they announced they were killing it.
Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..
Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc. They had been painfully taught to not trust Google, and, since that time, they didn't. And still don't.
They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared…
They could have just kept the entire service running with, what, 2 software engineers? Such a waste.
You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.
I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.
I genuinely thought all the chromecast audios I owned were useless bricks and was looking around for replacements and then they just started working again from an OTA update. Astounding. I assume someone got fired for taking time away from making search worse to do this.
(edit: https://www.techradar.com/televisions/streaming-devices/goog...)
Of course another question how long they will honor that commitment.
Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium.
No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works?
The only one I'm not sure about is Chromecast - a while back my ones had an "update" to start using their newer AI Assistant system for managing it. Still works.
Two ways. Gradually, then suddenly.
- Ernest Hemingway, The Sun Also Rises
Unfortunately the last public version has a bug that randomly swaps face tags, so you end up training on the wrong person's faces just enough to throw it all off, and the recognition becomes effectively worthless on thousands of family photos. 8(
Digikam is a weak sauce replacement that barely gets the job done.
Is there another app where I can store this locally?
The difference is they no longer store the data on their servers, it's stored on your phone (iPhone/Android)
https://support.google.com/maps/answer/6258979
That way, they can't respond to requests for that data by governments as they don't have it.
I can look on my phone and see all the places I've been today/yesterday, etc
Using it on a daily basis.
Edit: Missed the "locally" part. Sorry no suggestions. Maybe Garmin has something?
https://support.google.com/youtubemusic/answer/9716522
still have many domains on there, all with gmail
Which particular thing called Hangouts? There were at least two, frankly I’d say more like four.
Google and Microsoft are both terrible about reusing names for different things in confusing ways.
> Can't keep track of all the Google chat apps.
And Hangouts was part of that problem. Remember Google Talk/Chat? That was where things began, and in my family we never wanted Hangouts, Talk/Chat was better.
Allo, Chat, Duo, Hangouts, Meet, Messenger, Talk, Voice… I’ve probably forgotten at least two more names, knowing Google. Most of these products have substantial overlap with most of the rest.
I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals.
Why didn’t you quit Google after, say, the third product you used got canned?
Hangouts had trouble scaling to many participants. Google Meet is fine, and better than e.g. MS Teams.
Legacy suite, free forever? Did they also promise a pony?..
Play Music: music is a legal minefield. Don't trust anybody commercial who suggests you upload music you did not write yourself.
Finance: IDK, I still get notifications about the stocks I'm interested in.
NFC Wallet: alive and kicking, I use it literally every day to pay for subway.
Can't say anything about Chromecast. I have a handful of ancient Chromecasts that work. I don't want any updates for them.
(It’s not super obvious, especially on mobile, but once you see the site, just scroll down to see the content)
Then all those sites I used to post on stopped supporting rss one by one and finally pipes was killed off.
For a while I used a python library called riko that did the same thing as pipes without the visual editor. I have to thank it for getting me off php and into python.
https://github.com/nerevu/riko
It has the advantage of being open source, has well defined and stable APIs and a solid backend. Plus 10+ years of constant development with many learnings around how to implement flow based programming visually.
I used the Node-RED frontend to create Browser-Red[^2] which is a Node-RED that solely executes in the browser, no server required. It does not support all Node-RED functionality but gives a good feel for using Node-RED and flow based programming.
The second project with which I am using Node-RED frontend is Erlang-Red[^3] which is Node-RED with an Erlang backend. Erlang is better suited to flow based programming than NodeJS, hence this attempt to demonstrate that!
Node-RED makes slightly different assumptions than Yahoo! Pipes - input ports being the biggest: all nodes in Node-RED have either zero or one input wires, nodes in Yahoo! Pipes had multiple input wires.
A good knowledge of jQuery is required but that makes it simpler to get into the frontend code - would be my argument ;) I am happy to answer questions related to Node-RED, email in bio.
[^1]: https://nodered.org
[^2]: https://cdn.flowhub.org
[^3]: https://github.com/gorenje/erlang-red
I don't know if it was Yahoo Pipes that died, or a mainstream internet based on open protocols and standards.
Apache Karavan: https://karavan.space/ Kaoto (Red Hat): https://kaoto.io
Both are end-to-end usable within VS Code.
https://www.mashups.io
Crazy fast compiler, so it doesn't frustrate students who learn by trial and error; a decent type system without the wildness of, say, Rust; and all the basic programming building blocks you want students to grasp are present, without language-specific funkiness.
[1] https://en.wikipedia.org/wiki/Midori_%28operating_system%29
I've heard someone at Microsoft describe it as a moonshot but also a retention project; IIRC it had a hundred plus engineers on it at one time, including a lot of very senior people.
Apparently a bunch of research from Midori made it into .NET so it wasn't all lost, but still...
Never heard this phrase before, but I can definitely see this happening at companies of that size
The creator, kentonv (on HN), commented about it recently here https://news.ycombinator.com/item?id=44848099
I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things? Would we have ever bothered to come up with all the different tech, if AI was just chugging through features efficiently, with consistent training data etc.?
Sounds not that different from containers, if you just choose the most popular tooling.
Small projects: docker compose, postgres, redis, nginx
Big projects: kubernetes, postgres, redis, nginx
This is why Heroku lost popularity.
- not lowering prices as time went on. They probably kept a super-huge profit margin, but they're largely irrelevant today
- not building their own datacenters and staying on AWS. Building datacenters would have allowed them to lower prices and gain even more market share. Everyone who has been at Amazon/AWS has likely seen the internal market rate for EC2 instances and knows there's a HUGE profit margin in building datacenters. Add the recent incredible improvements in compute density (you can easily get 256c/512t and literally terabytes of memory in a 2U box) and you get basically an infinite money glitch.
10+ years ago I'd regularly build all sorts of little utilities with it. It was surprisingly easy to use it to tap into things that are otherwise a lot more work. For instance I used it to monitor the data coming from a USB device. Like 3 nodes and 3 patches to make all of that work. Working little GUI app in seconds.
Apple hasn't touched it since 2016, I kind of hope it makes a comeback given Blender and more so Unreal Engine giving people a taste of the node based visual programming life.
You can still download it from Apple, and it still technically works but a lot of the most powerful nodes are broken in the newer OS's. I'd love to see the whole thing revitalized.
https://wikipedia.org/wiki/Kuro5hin
I was a hold out on smartphones for a while and I used to print out k5 articles to read while afk... Just such an amazing collection of people sharing ideas and communal moderation, editing and up voting.
I learned about so many weird and wonderful things from that site.
The internet before advertising, artificial intelligence, social media and bots. When folks created startups in their bedrooms or garages. The days when google slogan was “don’t be evil”.
Communities are moving back to early Internet-like chatrooms like IRC, but now it is Slack, Discord, and the like. Everything private.
I don't like comparing the siloing of our information into Discord to the old internet. We had indexable information in forums that is now "lost", not in the literal sense, but because you wouldn't be able to find it without obsessive digging. Conversations in Discord communities are very surface-level and cyclical, because it's far less straightforward to keep track of and link to answers from last week, let alone two years ago. It is profoundly sad, to be honest.
Animated gifs of cats, banner bars, and pixels that cost one dollar each, until one million were sold.
And it all ran on Chuck Norris' personal computer.
This would have changed so much. Desktop apps powered by the engine of Firefox not Chrome.
Why? Not enough company buy in, not enough devs worked on it. Maybe developed before a major Firefox re-write?
Tauri apps take advantage of the web view already available on every user’s system. A Tauri app only contains the code and assets specific for that app and doesn’t need to bundle a browser engine with every app.
Rendering will still use Edge/Chromium on a generic Windows machine.
It has been in existence in some form or another for nearly 30 years, but did not gain the traction it needed and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress on it is moving so slow that I doubt we'll ever see it be released in a state that's useful for real users.
It's too bad, because a drop in Windows replacement would be nice for all the people losing Windows 10 support right now.
On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc. Unix and its source code was pretty well publicly documented and understood for decades before those projects started, nothing like that ever really existed for Windows.
I don't think people do, it sounds like a nearly impossible struggle, and at the end you get a Windows clone. I can't imagine hating yourself enough to work on it for an extended period of time for no money and putting yourself and your hard work in legal risk. It's a miracle we have Wine and serious luck that we have Proton.
People losing Windows 10 support need to move on. There's Linux if you want to be free, and Apple if you still prefer to be guided. You might lose some of your video games. You can still move to Windows 11 if you think that people should serve their operating systems rather than vice versa.
Some projects creep along slowly until something triggers an interest and suddenly they leap ahead.
MAME's Tandy 2000 implementation was unusable, until someone found a copy of Windows 1.0 for the Tandy 2000, then the emulation caught up until Windows ran.
Maybe ReactOS will get a big influx of activity after Windows 10 support goes offline in a couple days, or even shortly after when you can't turn AI spying off, not even three times a year.
And yet, no big leap in ReactOS (at least for now).
The project is supposed to be a clean-room reverse engineering effort. If you even see Windows code, you are compromised, and should not work on ReactOS.
Apparently copyright law only applies for humans, generative AI gets away with stealing because there is too much monetary interest involved in looking the other way.
I think nostalgia is influencing this opinion quite a bit, and we don't realize the mountain of tiny usability improvements that have been made since XP
Full C# instead of god forbidden js.
Full vector DPI-aware UI, with grid, complex animation, and all the other stuff that html5/css didn't have in 2018 but Silverlight had even in 2010 (probably even earlier).
MVVM pattern, two-way bindings. Expression Blend (basically Figma) that allowed designers to create UI that was XAML, had sample data, and could be used by devs as is, with maybe some cleanup.
Excellent tooling, static analysis, debugging, what have you.
Rendered and worked completely the same in any browser (safari, ie, chrome, opera, firefox) on mac and windows
If that thing still worked, boy would we be in a better place regarding web apps.
Unfortunately, iPhone killed adobe flash and Silverlight as an aftermath. Too slow processor, too much energy consumption.
Why do you think JavaScript is a problem? And a big enough problem to risk destroying open web standards.
TypeScript exists for the same reason things like mypy exists, and no one in their right mind claims that python's openness should be threatened just because static typing is convenient.
I suppose JS could go in the same direction and adopt the typing syntax from TS as a non-runtime thing. Then the typescript compiler would become something like mypy, an entirely optional part of the ecosystem.
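That is essentially how Python's setup behaves today, which is worth spelling out since it's the model being proposed for JS. A small sketch:

    # Annotations are ignored by the interpreter; an optional external
    # checker (mypy) enforces them before the code ever runs.
    def greet(name: str) -> str:
        return "hello " + name

    # Python itself accepts this call and only fails when the body executes.
    # mypy rejects it statically:
    #   error: Argument 1 to "greet" has incompatible type "int"; expected "str"
    greet(42)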
Stuff like angularjs was basically created for the same reason flash/silverlight went down — iphone
> A remote code execution vulnerability exists when Microsoft Silverlight decodes strings using a malicious decoder that can return negative offsets that cause Silverlight to replace unsafe object headers with contents provided by an attacker. In a web-browsing scenario, an attacker who successfully exploited this vulnerability could obtain the same permissions as the currently logged-on user. If a user is logged on with administrative user rights, an attacker could take complete control of the affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights.
https://learn.microsoft.com/en-us/security-updates/securityb...
Lots of their stuff was delivered as Silverlight apps. It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder, rather than the Byzantine HTML/JS/CSS ecosystem.
I get why it never took off, but in this niche of small-time custom software it was really way nicer than anything else that existed at the time. Web distribution combined with classic desktop GUI development.
[0] https://neocities.org/
I tried DDG (Bing-backed, I believe) and it happily found everything with no manual intervention at all. That was the point where I ditched Google Search after 30 years.
tumblr will practically let you do that for chrissake
You could have said Wordpress.com or something. It's not quite a website, but it's close. It's also probably going to be Typepad (i.e. defunct) in a few years and Blogger is probably going to be there quicker than that.
And that's precisely why companies nerf their web sites and put a little popup that says "<service> works better on the app".
Apple would have inevitably done their own thing, but it would have been really nice to have two widely used, mature and open mobile Linux platforms.
It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early.
If Microsoft had kept iterating on it with modern ML models, it could’ve become the "GarageBand for ideas that start as a hum."
It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.
LT was cool, but they abandoned it with insufficient hand-off when it was 80-90% done to work on Eve.
I know a bunch of people were unhappy that LightTable wasn't finished, especially because they raised money via Kickstarter for it.
Maybe Eve was too ambitious. Maybe funding never materialized. Maybe they just got bored and couldn't finish. Maybe they pissed off their audience.
Instead it went chasing markets, abandoning existing users as it did so, in favour of potential larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it.
I noticed the trend when I was working on a major web property for the Aditya Birla conglomerate. My whole team was pleasantly surprised, and we made sure to test everything in Firefox for that project. But everyone switched to Android + Chrome over the next few years, which was a shame.
Today, India is 90% Chrome :(
Luckily it wasn't long after Mozilla abandoned it that PWAs were introduced and I could port the apps I cared about.
That’s actually an incredibly cool concept.
In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then.
I find it wild how this project was 90%+ correct on how we will build web apps 14 years later.
https://non.tuxfamily.org
I used it quite a bit to produce radio shows for my country's public broadcasting. Because Non's line-oriented session format was so easy to parse with classic Unix tools, I wrote a bunch of scripts for it with Awk etc. (E.g. calculating the total length of clips highlighted with brown color in the DAW -- which was stuff meant for editing out; or creating a poor man's "ripple editing" feature by moving loosely-placed clips precisely side by side; or, eventually, converting the sessions to Samplitude EDL format, and, from there, to Pro Tools via AATranslator [1] (because our studio was using PT), etc. Really fun times!)
1: https://aatranslator.com.au/
- Based on BitTorrent ideas
- Completely decentralized websites' code and data
- Either completely decentralized or controllable-decentralized authentication
- Could be integrated into existing websites (!)
It's not exactly dead (there's a supported fork), but it still feels like a revolution that did not happen. It works really well.
Interesting how Flash became the almost universal way to play videos in the browser, in the latter half of the 2000's (damn I'm old...).
https://en.wikipedia.org/wiki/Adobe_Edge
I wonder why no one has managed to build something comparable that does work on a phone.
Maybe they could have fixed all that for touch screens, small portrait screens, and more but they never did make it responsive AFAIK.
Are you referring to the SWF file format?
[1]: https://en.wikipedia.org/wiki/Project_Ara
Edit: in fact I'd say they were irrelevant before pretty much all of those innovations. By the time AIM or MSN Messenger really became popular, ICQ didn't matter anymore.
A genius product ripped apart by Microsoft. Have you used Microsoft Teams recently? Bad UI, hard-to-configure external hardware, and a good level of incompatibility; it's even missing the good old "Echo / Sound Test Service". At one point I even installed Skype on my old Android, but it was sucking up too much battery.
BT had this grand vision for basically providing rich multimedia through the phone line, but in ~1998. Think a mix of on-demand cable and "teleconferencing" with TV-based internet (Ceefax/red button on steroids).
It would have been revolutionary and kick-started the UK's jump into online rich media.
However, it wouldn't have got past the regulators, as both Sky and NTL (now Virgin) would have protested loudly.
1. competing visions for how the entire system should work
2. dependence on early/experimental npm libraries
3. devs breaking existing features due to "innovation"
4. a lot of interpersonal drama because it was not just open source but also a social network
the ideas are really good, someone should make the project again and run with it
It was a fascinating protocol underneath, but the social follow structure seemed to select strongly for folks who already had a following or something.
Having seen what goes on in the foss world and what goes on in the large faang-size corporate world, no wonder the corporate world is light-years ahead.
Those people need to be pushed out early and often. That's what voting is for. You need a supermajority to force an end to discussion, and a majority to make a decision. If you hold up the discussion too long with too slim a minority, the majority can fork your faction out of the group. If the end of debate has been forced, and you can't work with the majority, you should leave yourself.
None of this letting the bullies get their way until everything is a disaster, then splitting up anyway stuff.
[1] https://austral-lang.org/ [2] https://austral-lang.org/spec/spec.html
It's at a very early stage of development but looks promising
There are lots of competing MLs you can use instead:
- F# (Fable)
- ReasonML
- OCaml (Bucklescript)
- Haskell
- PureScript
IMO the problem with Elm was actually The Elm Architecture.
https://guide.elm-lang.org/architecture/
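For anyone who hasn't seen it, the architecture itself is small: a single immutable model, a pure update function fed by messages, and a pure view. A rough sketch of that shape in Python (leaving out Elm's commands and subscriptions):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Model:
        count: int = 0

    # A pure update function folds messages into a new model...
    def update(msg: str, model: Model) -> Model:
        if msg == "increment":
            return Model(model.count + 1)
        if msg == "decrement":
            return Model(model.count - 1)
        return model

    # ...and a pure view renders the current model.
    def view(model: Model) -> str:
        return f"count = {model.count}"

    model = Model()
    for msg in ("increment", "increment", "decrement"):
        model = update(msg, model)
    print(view(model))  # count = 1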
I'm no frontend guy, but I think it inspired (or was inspired by?) React, and maybe Redux. Corrections on this very welcome.
In comparison I found Clojars^[0] for Clojure better and community driven like NPM. But obv Clojure has more business adoption than CL.
Do you use CL for work?
[0]: https://clojars.org/
(For those unfamiliar, Illustrator is a pure vector graphics editor; once you rasterize its shapes, they become uneditable fixed bitmaps. Fireworks was a vector graphics editor that rendered at a constant DPI, so it basically let you edit raster bitmaps like they were vectors. It was invaluable for pixel-perfect graphic design. Nothing since lets you do that, though with high-DPI screens and resolution-independent UIs being the norm these days, this functionality is less relevant than it used to be.)
Just barely stopped using my CS6 copy. Still haven't found anything as intuitive.
I’m not arguing the solutions it outlined are good, but I think some more discussion around how we interact with touch screens would be needed. Instead, we are still typing on a layout that was invented for mechanical typewriters - in 2025, on our touch screens.
https://youtu.be/zWz1KbknIZk?si=LWGsLQjFTWBOvzN-
https://en.wikipedia.org/wiki/IGoogle
https://en.wikipedia.org/wiki/Google_Desktop
and why? = UI/UX
At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and adopt gym level subscription tactics to boot, and people decided they'd rather not pay it which is to be expected. I often wonder if they'd have survived as a software company otherwise.
They were basically a better thought out Facebook before Facebook, in my opinion.
You could purposely choose to be online or offline.
Much easier to draw a line back then about how often you were online.
The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB and file transfer rates can be a big bottleneck.
Archive capture circa 2023: https://web.archive.org/web/20230329173623/https://ddramdisk...
HN post from 2023: https://news.ycombinator.com/item?id=35195029
What to do with it, once it's there, is a concern of software, but specialized hardware is needed to get it there.
https://en.wikipedia.org/wiki/Zram
https://wiki.archlinux.org/title/Zram
https://wiki.gentoo.org/wiki/Zram
for most purposes. (Assuming the host has enough RAM to spare, to begin with)
[0]: https://en.wikipedia.org/wiki/Tiny_Thief
> TUNES started in 1992-95 as an operating system project, but was never clearly defined, and it succumbed to design-by-committee syndrome and gradually failed. Compared to typical OS projects it had very ambitious goals, which you may find interesting.
[1] http://tunes.org/
If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.
Connect your phone to a display, mouse, keyboard and get a full desktop experience.
At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB-A instead of a single USB-C cable), and virtualization and containers were not quite there.
Today, going via pKVM seems like a promising approach. Seamless sharing of data, apps, etc. will take some work, though.
Also this: https://news.ycombinator.com/item?id=6676494
Redmart (Singapore): Best web based online store to this date (obviously personal view). No one even tries now that mobile apps have won.
https://techcrunch.com/2016/11/01/alibaba-lazada-redmart-con...
People talk so much about how you need to write code that fits well within the rest of the codebase, but what tools do we have to explore codebases and see what is connected to what? Clicking through files feels kind of stupid because if you have to work with changes that involve 40 files, good luck keeping any of that in your working memory. In my experience, the JetBrains dependency graphs also aren't good enough.
Sourcetrail was a code visualization tool that allowed you to visualize those dependencies and click around the codebase that way, see what methods are connected to what and so on, thanks to a lovely UI. I don't think it was enough alone, but I absolutely think we need something like this: https://www.dbvis.com/features/database-management/#explore-... but for your code, especially for codebases with hundreds of thousands or like above a million SLoC.
Example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc...
Another example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc...
I yearn to some day view entire codebases as graphs with similarly approachable visualization, where all the dependencies are highlighted when I click an element. This could also go so, so much further - you could have a debugger breakpoint set and see the variables at each place, alongside being able to visually see how code is called throughout the codebase, or hell, maybe even visualize every possible route that could be taken.
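As a taste of what such tooling digs out, here's a toy sketch using only Python's standard library: extract a "who calls whom" graph for one file. Real tools like Sourcetrail did this across whole multi-language codebases and drew it interactively; this is just the flavor of the idea.

    import ast, textwrap

    def call_graph(source: str) -> dict[str, set[str]]:
        # Map each function to the plain-name calls made inside it.
        graph: dict[str, set[str]] = {}
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef):
                graph[node.name] = {
                    c.func.id
                    for c in ast.walk(node)
                    if isinstance(c, ast.Call) and isinstance(c.func, ast.Name)
                }
        return graph

    src = textwrap.dedent("""
        def parse(data): return clean(data)
        def clean(data): return data.strip()
        def main(): print(parse(" hi "))
    """)
    for fn, callees in sorted(call_graph(src).items()):
        print(fn, "->", sorted(callees))
    # clean -> []
    # main -> ['parse', 'print']
    # parse -> ['clean']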
What was the bookmarks social tool called from 00’s? I loved it and it fell off the earth. You could save your bookmarks, “publish” them to the community, share etc..
What ever happened to those build your own homepage apps like startpage (I think)? I always thought those would take off
del.icio.us! Funnily, also killed by Yahoo, like Flickr.
I just wanna make a mostly static site with links in and out of my domain. Maybe a light bit of interactivity for things like search that autocompletes.
https://www.youtube.com/watch?v=e5wAn-4e5hQ
https://www.youtube.com/watch?v=QWsNFVvblLw
Summary:
>This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache.
>Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time.
>The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships.
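The core mechanism being described is essentially demand paging applied to game content. A toy sketch of the caching layer (illustrative only; the real thing is an OS-level virtual file system, and fetch_chunk here is a hypothetical stand-in for a network download):

    CHUNK = 1 << 20  # fetch granularity: 1 MiB

    class OnDemandBlob:
        """A byte blob that downloads chunks the first time they are read."""

        def __init__(self, fetch_chunk, size):
            self.fetch_chunk = fetch_chunk  # hypothetical: downloads one chunk by index
            self.size = size
            self.cache = {}                 # chunk index -> bytes

        def read(self, offset, length):
            first, last = offset // CHUNK, (offset + length - 1) // CHUNK
            buf = bytearray()
            for i in range(first, last + 1):
                if i not in self.cache:     # first touch: fetch over the network
                    self.cache[i] = self.fetch_chunk(i)
                buf += self.cache[i]
            start = offset - first * CHUNK
            return bytes(buf[start:start + length])

Repeat reads are served from the cache, which is why, in the demo, a second play session needs little or no network.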
I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by XAI and gave up promoting the project.
https://x.com/rflaherty71/status/1818668595779412141
But I could see the technology behind it working wonders for Steam, Game Pass, etc.
I don't see how this could take off. Internet speeds are getting quicker, disk space is getting cheaper, and this will slow down load times. And what's worse, the more you need this tech, the worse an experience you get.
One always must define a one sentence goal or purpose, before teams think about how to build something.
Cell processors, because most coders can't do parallelism well
Altera consumer FPGA, as they chose behavioral rather than declarative best practices... then the Intel merger... metastability in complex systems is hard, and most engineers can't do parallelism well...
World Wide Web, because social-media and Marketers
Dozens of personal projects, because sometimes things stop being fun. =3
Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.
A place where artists and consumers could freely communicate and socialize without hassle.
Died because of stupidity, commercialisation, and walled-gardening.
Died due to legal wranglings about patents, iirc.
More here: https://news.ycombinator.com/item?id=45061680
Unfortunately, it died because it's very niche and they couldn't keep up with the development of drivers for desktops. This is even worse today...
http://opalang.org/
I think the market was still skeptical about Node.js on the server at the time, but other than that I don't really know why it didn't take off
That said, frameworks were all the buzz back in the day, so the language alone probably wouldn't have gone anywhere without it.
All of the upside and none of the downside of React
No JSX and no compiler, all native JS
The main dev is paid by Microsoft to do OSS Rust nowadays
I use choo for my personal projects and have used it twice professionally
https://github.com/choojs/choo#example
The example is like 25 lines and introduces all the concepts
Fewer moving parts than Svelte
For example, Haunted is a react hooks implementation for lit: https://github.com/matthewp/haunted
Choo suffered from not having an ecosystem, same as Mithril and other "like React but not" also-rans.
I kind of expect we might see something similar if the AI bubble pops
I wonder who owns the domain now
https://en.wikipedia.org/wiki/Wuala
Why? Obviously a close-to-zero market. It was unbelievable how those people thought those projects would even succeed.
I first learned about it when I was working in a university group and had the task of porting a windowing algorithm that already worked in MATLAB to Python. It felt like a modern linter and LSP with additional support through machine learning. I don't quite know why it got comparatively little recognition, but perhaps enough to remain an avant-garde pioneer of both Python and machine-learning support for later generations and wider applications.
(Not the Linux distribution with the same name)
I have used it for years.
A two-pane manager; it makes defining file associations, applications invoked by extension, and shortcut buttons easy and convenient.
Sadly it is abandonware now.
Slowly migrating to Double Commander now...
In the end I wound up with basically the same application software as on my Debian desktop, except running on Haiku instead of Linux. Haiku is noticeably snappier and more responsive than Linux+X+Qt+KDE, though.
1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately, Vista was famously delayed, and in an attempt to get it released, Microsoft pared back features; one of the casualties was WinFS. Instead, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. (A rough sketch of the idea follows after this list.)
2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers can combine these components to form larger solutions. For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components can be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E.
OpenDoc was a radically different paradigm for software development and distribution, and I think it could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog).
Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o).
I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications.
I think it would have been cool had the FOSS community pursued component-based software. It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc.
3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk.
4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects as Symbolics Genera.
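Re: point 1, here is a rough sketch of the WinFS idea (file metadata as database rows you can query) using SQLite. The schema and the example query are hypothetical; WinFS itself sat on the relational engine behind SQL Server and modeled much richer item types than this.

    import os
    import sqlite3

    db = sqlite3.connect("filesystem.db")
    db.execute("""CREATE TABLE IF NOT EXISTS files (
        path TEXT PRIMARY KEY, name TEXT, ext TEXT,
        size INTEGER, mtime REAL)""")

    def index_tree(root):
        """Mirror file metadata under root into the database."""
        for dirpath, _dirs, names in os.walk(root):
            for name in names:
                p = os.path.join(dirpath, name)
                st = os.stat(p)
                db.execute("INSERT OR REPLACE INTO files VALUES (?,?,?,?,?)",
                           (p, name, os.path.splitext(name)[1].lower(),
                            st.st_size, st.st_mtime))
        db.commit()

    index_tree(os.path.expanduser("~/Documents"))

    # "Organize and search using database queries" instead of walking folders:
    for (path,) in db.execute("""SELECT path FROM files
                                 WHERE ext = '.pdf' AND size > 10 * 1024 * 1024
                                 ORDER BY mtime DESC"""):
        print(path)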
For anyone interested in the Apple future that could have been, check out Jim Miller's articles, e.g. on LiveDoc (https://www.miramontes.com/writing/livedoc/index.php)
MIME types for mail addressed much of the demand for pluggable data types.
They stopped supporting small tablets some years ago though, and made it worse with every Windows update. I can only surmise that it was to make people stop using them. Slow GUI, low contrast, killed apps.
Dual screen iPad killer, productivity optimised. IIRC Microsoft OneNote is its only legacy.
Killed because both the Windows team and the Office team thought it was stepping on their toes.
Instead it went into a slow death spiral due to Windows 95.
Got a good family story about that whole acquisition attempt, but I don't want to speak publicly on behalf of my uncle. I know we've talked at length about the what-ifs of that moment.
I do have a scattering of some neat Quarterdeck memorabilia I can share, though:
https://www.dropbox.com/scl/fo/0ca1omn2kwda9op5go34e/ACpO6bz...
But you couldn't actually buy /X. After trying to buy a copy, my publisher even contacted DESQ's marketing people to get a copy for me, and they wouldn't turn one over. Supposedly there were some copies actually sold, but too few, too late, and then /X was dropped. There was at least one more release of plain DESQview after that, but by then Windows was eating its lunch.
ISO/OSI had a session layer, i.e. much of what QUIC does regarding multiple underlying transports.
Speaking of X.509, the s-expression certificate format was more interesting in many ways.
X.400 was a nice idea, but the ideal of having a single global directory predates security. I can understand why it never happened
On X.509, the spec spends two chapters on attribute certificates, which I've never seen used in the wild. It's a shame; identity certificates do a terrible job at authentication
Looked cool during demos. Got killed when Flash died.
JavaScript/HTML-based smartphone/app interface.
Their execution was of course bad, but today's LLMs are better and faster, and there are far more OSS models to reduce costs. The hardware looked nice, though, and the pico projector was an interesting concept, even if not the best executed.
I wrote a bunch of software in Borland Delphi, which ran in Windows, Wine, and ReactOS with no problems. Well, except for ReactOS' lack of printing support.
As long as you stay within the ECMA or published Windows APIs, everything runs fine in Wine and ReactOS. But Microsoft products are full of undocumented functions, as well as checks to see if they're running on real Windows. That goes back to the Windows 3.1 days, when 3.1 developers regularly used OS/2 instead of DOS, and Microsoft started adding patches to fail under OS/2 and DR-DOS. So all that has to be accounted for by Wine and ReactOS. A lot of third-party software uses undocumented functions as well, especially stuff written back during the days when computer magazines were a thing, and regularly published that kind of information. A lot of programmers found the lure of undocumented calls to be irresistible, and they wound up in all kinds of commercial applications where they really shouldn't have been.
In my experience anything that will load under Wine will run with no problems. ReactOS has some stability problems, but then the developers specifically call it "alpha" software. Despite that, I've put customers on ReactOS systems after verifying all their software ran on it. It gets them off the Microsoft upgrade treadmill. Sometimes there are compatibility problems and I fall back to Wine on Linux. Occasionally nothing will do but real Windows.
Which reduces its innovation level to nothing more than a chest-mounted camera.
You want real B2C products that people would actually buy? Look at the Super Bowl ads instead. Then watch the Humane ad again. It's laughable.
https://en.wikipedia.org/wiki/FireChat
People always fail to see something that is an inevitability. Humans lack foresight because they don't like change.
Google Glass sucks though, and glasses will never be a thing. Google and Meta and … can spend $8T and come up with the most insane tech etc., but no one will be wearing f'ing glasses :)
It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.
> I wanted there to be a reasonable debate on it
I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".
I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?
I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
Once that idea appears, it allows every lobbyist and insider to say “mandate this, we’ll do something like what Apple did but for other types of Bad People” and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom.
Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.
Great!
> the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Apple never made that assertion, and the system they designed is incapable of doing that.
> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.
Apple’s system cannot do that. If you change parts of it, sure. But the system they proposed cannot.
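To make the distinction concrete, here is the rough shape of the proposal as published: perceptual hashes matched against a fixed, curated database, with a threshold before anything is reported. This is a toy Python sketch with hypothetical names and a stand-in hash function; the real design used NeuralHash plus blinded hash matching and threshold secret sharing on top, so matches below the threshold were not even visible to Apple.

    import hashlib

    KNOWN_HASHES = {"db-entry-1", "db-entry-2"}  # curated database, not a model
    THRESHOLD = 30  # no account is flagged below N matches

    def perceptual_hash(image_bytes):
        # stand-in; the real thing was a perceptual hash robust to
        # resizing/recompression, not a cryptographic digest
        return hashlib.sha256(image_bytes).hexdigest()[:12]

    def account_flagged(uploaded_images):
        """Count database hits across uploads; flag only past the threshold."""
        hits = sum(perceptual_hash(img) in KNOWN_HASHES for img in uploaded_images)
        return hits >= THRESHOLD

The only thing this can flag is an image whose hash someone already put in the database. Turning it into a detector for "politically sensitive images" means replacing the database and the matching mechanism, which is exactly the "change parts of it" case: a different system.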
To reiterate what I said earlier:
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
So far, you are saying that you don’t have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn’t propose, that is significantly different in multiple ways.
Also, what do you mean by “model”? When I used the word “model” it was in the context of using another system as a model. You seem to be using it in the AI sense. You know that’s not how it worked, right?
Chat Control, and other proposals that advocate backdooring individual client systems.
Clients should serve the user.
Chat Control is older than Apple’s CSAM scanning and is very different from it.
> Clients should serve the user.
Apple’s system only scanned things that were uploaded to iCloud.
You missed the most important part of my comment:
> I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.
No thanks. I'll take a hammer to any device in my vicinity that implements police scanning.
No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?
> How about making Apple vulnerable to demands from every government where they do business?
They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.
I believe my retro Nokia S60/S90 phones don't have any spyware. I believe earlier Nokia models like the S40 or monochrome ones don't even have the ability to spy on me (though RMS considers triangulation spyware). I don't believe any products from the duopoly, without even root access, are free from all kinds of vendor rootkits.
But not very different to how it was actually going to work, as you say:
> If you change parts of it, sure.
Now try to reason your way out of the obvious "parts of it will definitely change" knee-jerk.
Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple’s system, you can complain about that imaginary system all you want, but it won’t be meaningful, it’s just noise.
If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion. It gives you crazy person or salesman vibes. These are arguments that someone with a serious interest in the technology would be aware of already and should be included as a prerequisite to being taken seriously. Doing this shows that you value other people's time and effort.
Brief (CC0): https://doi.org/10.5281/zenodo.17305774 Curious: would this structure have saved any of the projects mentioned here?