De-Bloated Windows 11 Build Runs on 2GB of RAM

(tomshardware.com)

474 points | by smusamashah 418 days ago

43 comments

  • abraxas 418 days ago
    Let’s pause for a bit and dwell on the absurd amount of RAM it takes to run it even after this exercise. Anyone here remember when QNX shipped a demo in 2000 with a kernel, GUI, web browser and an email client on a single 3.5” floppy? The memory footprint was also a few megabytes. I’m not saying we should be staying within some miserly arbitrary constraints, but my goodness, something that draws UI and manages processes has not grown in complexity by four orders of magnitude in 20 years.
    • mixedCase 418 days ago
      Hasn't it, though? HDR, fluid animations, monstrous resolutions, 3D everything, accessibility, fancy APIs for easier development allowing for more features, support for large amounts of devices, backwards compatibility, browsers are almost unrecognizable in featureset to the point they resemble an OS unto themselves, email clients have stayed mostly the same at least except for the part that they also ship a browser and few of us even use 'em anymore! Some of those features combine exponentially in complexity and hardware requirements, and some optimizations will trade memory for speed.

      Not going to defend particular implementations, but requirements? Those have definitely grown more than we give them credit.

      • grishka 418 days ago
        > HDR, fluid animations, monstrous resolutions

        That's the job of the GPU driver, mostly.

        > 3D everything

        That's the desktop compositor. Windows 7 already had one and ran on 1 GB of RAM.

        > accessibility

        Not everyone needs it, so it should be an optional installable component for those who do.

        > fancy APIs for easier development allowing for more features

        Those still use win32 under the hood. Again, .NET has existed for a very long time. MFC has existed for even longer.

        > support for large amounts of devices

        No one asked for Windows on touchscreen anything. Microsoft decided that themselves and ruined the UX for the remaining 99% of the users that still use a mouse and a keyboard.

        > backwards compatibility

        That's what Microsoft does historically, nothing new here.

        > browsers are almost unrecognizable in featureset to the point they resemble an OS unto themselves

        No one asked for this. My personal opinion is that everything app-like about browsers needs to be undone, yesterday, and they should again become the hypertext document viewers they were meant to be. Even JS is too much, but I guess it does have to stay.

        • dagmx 418 days ago
          > That's the job of the GPU driver, mostly.

          I think you have to reason this one out. Your statement, to me, doesn’t hold water.

          Let’s start with HDR. That requires the content that’s being rendered to have higher bit depth. Not all of this is stored in GPU memory at once, a lot is stored in system RAM and shuffled in and out.

          Now take fluid animations. The interpolation of positions isn’t done solely on the GPU. It’s coordinated by the CPU. I don’t think this one necessarily adds ram usage but I think your comment is incorrect.

          And lastly with resolutions, the GPU is only responsible for the processing and output. You still need high resolution data going in. This is easily observed by viewing any low resolution image. It will be heavily blurred or pixelated on a high resolution screen. It stands to reason that the OS needs to have high enough resolution assets to accommodate high resolution screens. Now these aren’t all necessarily stored on disc as high resolution graphics, but they have to be stored in memory as such.
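          To put rough numbers on the asset-size point (the icon dimensions below are illustrative assumptions, not measurements from any real OS), a quick sketch of how asset memory scales with backing resolution:

```python
# Uncompressed RGBA memory for one UI asset at two backing resolutions.
# The sizes are hypothetical examples, not actual Windows asset sizes.
def rgba_bytes(width, height):
    return width * height * 4  # 4 bytes per pixel for 8-bit RGBA

low_dpi = rgba_bytes(64, 64)     # 16 KiB: a classic low-DPI icon
high_dpi = rgba_bytes(512, 512)  # 1 MiB: the same icon for a high-density screen
print(high_dpi // low_dpi)       # 64x the memory for a single asset
```

          Multiply that across every icon, cursor, and texture the shell keeps resident, and high-resolution assets alone account for a real chunk of RAM.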

          ——

          As to the rest of your points, they basically boil down to: I don’t want it, so I don’t see why a default install should have it. Other people do want a highly featureful browser that can keep up with the modern web. And given that webviews are a huge part of application rendering today, the browser actively contributes to memory usage.

          • phkahler 417 days ago
            >> Let’s start with HDR. That requires the content that’s being rendered to have higher bit depth. Not all of this is stored in GPU memory at once, a lot is stored in system RAM and shuffled in and out.

            HDR can still fit in 32-bit pixels. At 4k x 2k we have 8 megapixels, or a 32MB frame buffer. With triple buffering that's still under 100MB. Video games have been doing all sorts of animation for decades. It's not a lot of code, and a modern CPU can actually composite a desktop in software pretty well. We use the GPU for speed, but that doesn't have to mean more memory.

            The difference between 2000 and 2023 is the quantity of data to move and, like I said, that's about 100MB.
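            That back-of-the-envelope figure checks out; a sketch, assuming 3840x2160 for the 4k-by-2k resolution:

```python
# Frame buffer memory at 4K with 32-bit pixels, per the estimate above.
width, height = 3840, 2160
bytes_per_pixel = 4  # 32-bit pixels (e.g. 10-bit-per-channel RGB plus 2-bit alpha)
one_buffer = width * height * bytes_per_pixel
triple = 3 * one_buffer
print(one_buffer / 2**20)  # ~31.6 MiB per buffer
print(triple / 2**20)      # ~94.9 MiB triple-buffered, i.e. under 100MB
```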

          • doubled112 418 days ago
            I've got 8GB of RAM on my GPU.

            Could we stop shuffling it out? Do more of the work there, directly?

            • dagmx 418 days ago
              Unintuitively, your two questions are somewhat at odds with each other.

              The more work you do on the GPU, the more you need to shuffle because the more GPU memory you’d use AND the more state you’d need to check back on the CPU side, causing sync stalls. It’s not insurmountable, and macOS puts a lot more of its work on the GPU for example. Windows is a little more conservative in that regard.

              Here are some more confounding factors:

              - Every app needs one or more buffers to draw into. Especially with hidpi screens this can eat up memory quick. The compositor can juggle these to try and get some efficiency, but it can’t move all the state to the GPU due to latency.

              - You also need to deal with swap memory. You’d ultimately need to shuffle data back to system RAM and then to disk and back, which is fairly slow. It’s much better theoretically on APUs though.

              Theoretically, APUs stand to solve a lot of these issues because they blur the lines of GPU and CPU memory.
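              To put illustrative numbers on the per-app buffer point (the window size and scale factor below are assumptions, not measured values):

```python
# Backing-store memory for one window at a 2x HiDPI scale (illustrative).
logical_w, logical_h = 2560, 1440
scale = 2  # HiDPI backing scale factor
w, h = logical_w * scale, logical_h * scale
per_buffer = w * h * 4  # 8-bit RGBA
double_buffered = 2 * per_buffer
print(per_buffer / 2**20)       # 56.25 MiB per buffer
print(double_buffered / 2**20)  # 112.5 MiB for one double-buffered window
```

              So a handful of open HiDPI windows can plausibly account for hundreds of MiB of buffers before any application logic runs.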

            • SSLy 418 days ago
              • dagmx 418 days ago
                Direct storage doesn’t address the majority of these concerns though. It only means the CPU doesn’t need to load data first to shuffle it over, but it doesn’t help if the CPU does need to access said data or schedule it.

                It’s largely applicable mainly to games where resource access is known ahead of time.

                • deadly_syn 418 days ago
                  Wouldn't a desktop interface also be a predefined set of resources? Usually with far less visual flair, too.
                  • dagmx 417 days ago
                    Only if you’re dealing with just the desktop environment and don’t allow the user to load applications. Or if those apps also didn’t allow dynamism of any kind, like loading images from a website.
        • Andrex 418 days ago
          > > browsers are almost unrecognizable in featureset to the point they resemble an OS unto themselves

          > No one asked for this. My personal opinion is that everything app-like about browsers needs to be undone, yesterday, and they should again become the hypertext document viewers they were meant to be. Even JS is too much, but I guess it does have to stay.

          People did ask for this, because it made them a lot of money.

          You should recognize your opinion is a minority one outside of tech (and possibly, there too).

          To wit, virtually no one is jumping to Gopher or Gemini.

          • josephg 418 days ago
            > No one asked for this.

            What people want is a way to run amazon.com (and gmail and slack and so on), on any of their devices, securely, and without the fuss of installing anything.

            Ideally the first-time use of amazon.com should involve nothing more than typing "amazon" and hitting enter. It should show content almost instantly.

            Satisfying that user need doesn't require a web browser. If OS vendors provided a way to do that today, we'd be using it. But they don't.

            OS vendors still don't understand that. They assume people forever want to install software via a package manager. They assume software developers care about their platform's special features enough to bother learning Kotlin / Swift / GTK / C# / whatever. And they assume all the software users run should be trusted with all of their local files.

            Why is docker popular? Because it lets you type the name of some software. The software is downloaded from the internet. The software runs on linux/mac/windows. And it runs in a sandbox. Just like the web.

            The web - for all its flaws - is still the only platform which delivers that experience to end users.

            I'd throw out javascript and the DOM and all that rubbish in a heartbeat if we had any better option.

            • bufferoverflow 418 days ago
              > What people want is a way to run amazon.com (and gmail and slack and so on)

              Guess what, both GMail and Slack have video calls. They use WebRTC. The browser has to support it. So the WebRTC code is a part of it.

              > Ideally the first-time use of amazon.com should involve nothing more than typing "amazon" and hitting enter. It should to show content almost instantly.

              And it does. Open an incognito tab, type amazon.com, it's pretty crazy how fast it loads, with all the images.

              • josephg 417 days ago
                > Open an incognito tab, type amazon.com, it's pretty crazy how fast it loads, with all the images.

                Yes; that's my point. That's the bar native apps need to reach to be competitive with the web.

                • tsimionescu 417 days ago
                  You're just proposing to move all the complexity of the browser into some other VM that would have to be shipped by default by all OS platforms before it could become useful.

                  Java tried exactly this, and it never took off in the desktop OS world. It wasn't significantly slimmer than browsers either, so it wouldn't have addressed any of your concerns.

                  Also, hyperlinking deep into and out of apps is still something that would be very very hard to achieve if the apps weren't web native - especially given the need to share data along with the links, but in a way that doesn't break security. I would predict that if you tried to recreate a platform with similar capabilities, you would end up reinventing 90% of web tech (though hopefully with a saner GUI model than the awfulness of HTML+CSS+JS).

                  • josephg 417 days ago
                    > You're just proposing to move all the complexity of the browser into some other VM that would have to be shipped by default by all OS platforms before it could become useful.

                     I'm not proposing that. I didn't propose any solution to this in my comment. For what it's worth, I agree with you - another Java Swing style approach would be a terrible idea. And I have an irrational hate for docker.

                    If I were in solution mode, what I think we need is all the browser features to be added to desktop operating systems. And those features being:

                    - Cross platform apps of some kind

                    - The app should be able to run "directly" from the internet in a lightweight way like web pages do. I shouldn't need to install apps to run them.

                    - Fierce browser tab style sandboxing.

                    If the goal was to compete with the browser, apps would need to use mostly platform-native controls like browsers do. WASM would be my tool of choice at this point, since then people can make apps in any language.

                    Unfortunately, executing this well would probably cost 7-10 figures. And it'd probably need buy in from Apple, Google, Microsoft and maybe GTK and KDE people. (Since we'd want linux, macos, ios, android and windows versions of the UI libraries). Ideally this would all get embedded in the respective operating systems so users don't have to install anything special, otherwise the core appeal would be gone.

                    Who knows if it'll ever happen, or if we'll just be stuck with the web forever. But a man can dream.

                    • tsimionescu 417 days ago
                      My thinking is that, ultimately, if you want to run the same code on Windows, MacOS, and a few popular Linux distros, and to do so on x86 and ARM, you need some kind of VM that translates an intermediate code to the machine code, and that implements a whole ton of system APIs for each platform. Especially if you want access to a GUI, networking, location, 3D graphics, Bluetooth, sound etc. - all of which have virtually no standardization between these platforms.

                      You'll then have to convince Microsoft, Apple, Google, IBM/Red Hat, Canonical, the Debian project, and a few others, to actually package this VM with their OSs, so that users don't have to manually choose to install it.

                      Then, you need to come up with some system of integrating this with, at a minimum, password managers, SAML and OAuth2, or you'll have something far less usable and secure than an equivalent web app. You'll probably have to integrate it with many more web technologies in fact, as people will eventually want to be able to show some web pages or web-formatted emails inside their apps.

                      So, my prediction is that any such effort will end-up reimplementing the browser, with little to no advantages when all is said and done.

                      Personally, I hate developing any web-like app. The GUI stack in particular is atrocious, with virtually no usable built-in controls, leading to a proliferation of toolkits and frameworks that do half the job and can't talk to each other. I'm hopeful that WASM will eventually allow more mature GUI frameworks to be used in web apps in a cross-platform manner, and we can forget about using a document markup language for designing application UIs. But otherwise, I think the web model is here to stay, and has in fact proven to be the most successful app ecosystem ever tried, by far (especially when counting the numerous iOS and Android apps that are entirely web views).

                      • josephg 417 days ago
                        > You'll then have to convince Microsoft, Apple, Google, IBM RedHat, Canonical, the Debian project, and a few others, to actually package this VM with their OSs, so that users don't have to manually choose to install it.

                        I think this is the easy part. Everyone is already on board with webassembly. The hard part would be coming up with a common api which paves over all the platform idiosyncrasies in a way that feels good and native everywhere, and that developers actually want to use.

                    • yodon 417 days ago
                      > what I think we need is all the browser features to be added to desktop operating systems.

                      I trust you are aware Microsoft did exactly that, and the entire tech world exploded in anger, and the US Government took Microsoft to court to make them undo it, on the grounds that integrating browser technology into the OS was a monopolistic activity[0].

                      [0]https://en.m.wikipedia.org/wiki/United_States_v._Microsoft_C....

            • pjerem 417 days ago
              While I agree with you, I don’t think people really wanted this. I mean, life wasn’t miserable when web apps didn’t exist.

              We could have lived in an alternative universe where we succeeded to teach people the basics of how to use the computer as a powerful tool for themselves.

              Instead, corporations rushed to make most of the things super easy to make billions on the way.

              I’d even say that this wasn’t really a problem until they realized that closed computers allowed them more control and more money.

              So yeah, now we are stuck with web apps on closed systems and most people are happy with it, that’s true.

              And, as time passes, we are losing universal access to "the computer". Instead of a great tool for enabling power to the people, it’s being transformed into a prison to control what the people can do, see and even think.

              ps : When I say "computer" I include PC, phones, tablets, voice assistants … everything with a processor running arbitrary programs.

              • mixermachine 417 days ago
                I disagree. When I want to deliver a piece of software to my parents, I first think about a web solution (they are, for me, a stand-in for >80% of PC users). I just uninstalled a browser toolbar from my step father's PC last weekend. There are simply too many bad actors out there. The browser sandbox works pretty well against them. My parents have become very hesitant to install anything, even iOS updates, because they don't like change and fear that they might do something wrong.

                I agree that JS is not a gold standard. Still, it works most of the time, and with TypeScript stapled on top it is acceptable.

                Time has proven again and again (not only in tech) that the simple solutions will prevail. Want to change it? Build a simpler and better solution. I don't like that too but that's human nature at work.

          • pdntspa 418 days ago
            I'm so sick of people shutting down valid opinions because they have a "minority opinion" about tech. That tech slobbers so messily over the majority -- and, seemingly, ONLY the majority -- is a massive disservice to all of the nerds and power users that put these people where they are today.

            Maybe, instead of shutting those opinions down, you should reflect on how you, in whatever capacity you serve our awful tech overlords, can work to make these voices more heard and included in software/feature design

            • simlevesque 418 days ago
              I hear you, but OP said 'no one asked for this', and people did ask for this. The whole argument was about the popularity of the idea of adding features to browsers.
            • Andrex 418 days ago
              If OP had written their comment differently, I would have approached it differently is all I can say. I think my comment history would bear that out.

              Fwiw: https://news.ycombinator.com/item?id=34226798

        • djur 418 days ago
          > Not everyone needs [accessibility], so it should be an additional installable component for those who do.

          The UI has to be designed from the ground up to support accessibility.

          • cratermoon 418 days ago
            I'd also like to add that accessibility is not a binary that's either on or off. Parent comment might be thinking of features for people with high disability ratings, but eventually everyone has some level of disability. Some even start off life with one: color blindness, vision impairment. Most people have progressive near vision loss (presbyopia) as they age.

            Also, disability may not be permanent. I recently underwent major surgery and for at least a few days afterwards using my cell phone was nearly impossible. I resorted to voice control a few times because I did not have the coordination or cognitive function to type. (Aside: cell phones in general are accessibility dumpster fires, but it took a major life event to demonstrate to me how bad it really is.)

            So no, accessibility is not just a toggle switch or installable library. In fact, I hope future UI design incorporates some kind of non-intrusive learning and adaptability, such that when the system detects the user continually making certain kinds of errors, the UI will adapt to help.

            • prmoustache 417 days ago
              And I'll add: you want it available on the installer already. Having said that, you can't explain all that bloat with accessibility.
              • cratermoon 417 days ago
                > you want it available on the installer already

                Of course. Navigating around the install process without accessibility already enabled is going to be a non-starter for many.

                As for why all the bloat? I speculate it's because accessibility features are a second-class citizen at best, and when it comes to optimizing and streamlining, all the effort in development goes into the most-used features, whether or not they are the most essential.

          • abraxas 418 days ago
            Accessibility support was in Windows 95 as far as I remember.
            • KyeRussell 418 days ago
              “Accessibility support” is not a thing.
            • Ar-Curunir 418 days ago
              Are you seriously suggesting that accessibility support in modern OSes is less complex than in Windows 95?
              • Dylan16807 418 days ago
                I'm suggesting the modern accessibility support doesn't need more memory than the entirety of windows 95. So 4MB extra, or let's say 10x that to be generous.
              • hulitu 418 days ago
                Yes. Windows 10, at least, is a disaster. Without high contrast, which looks terrible, it draws gray colors on a light background, making it difficult to read.
                • anthk 418 days ago
                  Microsoft should copy the Zukitre theme from GTK2/3 as the inspiration. It's flat yet it has some contrast.
          • grishka 418 days ago
            [flagged]
            • dagmx 418 days ago
              Accessibility is much more than just labels for a screen reader. Please stop trivializing anything that you don’t use directly; it’s a common thread between all your comments, and it’s a disservice to both the points you’re trying to make and the people who actually use those things.

              Accessibility includes interaction design, zoom ability, audio commands, action link ups, alternate rendering modes, alternate motion modes, hooks for assistive devices to interact with the system. It goes far deeper into the system than just labels for a screen reader.

              If you stopped to just think about the vast number of disabilities out there, you’d realize how untrue your statement is.

              • userbinator 418 days ago
                All that extra crap doesn't make any sense, when the earliest versions of Windows up to ~7 had controls to let you adjust the UI to exactly how you'd like it, which is of course very important for accessibility.

                Then starting with Windows 8, they removed a lot of those features. 11 is even worse.

              • grishka 418 days ago
                My point is that accessibility being a thing shouldn't ruin the UI for the people who don't need it. There's no need to visually redesign anything to introduce accessibility. Apps don't need to be made aware whether some control has focus because the user has pressed the tab key, or because it's being focused by a screen reader, or because of some other assistive technology. Colors and font sizes can also be configured and they've been configurable since at least Windows 3.1 — and that is exposed to apps.

                Again, I don't see how the things you specified can't be built into existing win32 APIs and why anything needs to be designed from the ground up to support them.

                • dagmx 418 days ago
                  Your point about “apps don’t need to be made aware” is precisely the reason accessibility is part of the system UI framework.

                  Accessibility is also not something that is just a binary. You may be slightly short sighted and need larger text, you might need an OS specified colour palette that overrides the apps rendering. There’s just so many levels of nuance here. It’s not just “apps can configure a palette”, it’s that they need to work across the system

                  If you have the time, I really suggest watching the Apple developer videos on accessibility to see why it’s not just as simple as you put it. Microsoft do a lot of great work for accessibility too; they just don’t have much content up to delve into it.

                  As to why it has to be developed from the ground up, it doesn’t, but it needs to be at the foundation regardless. Apple for example didn’t redo their UI for accessibility, however Microsoft take a more “we won’t touch existing stuff in case we break it” approach to their core libs.

                  Also, again, I’d point out that you’re purposefully trying to trivialize something you don’t use.

                  • grishka 418 days ago
                    > It’s not just “apps can configure a palette”, it’s that they need to work across the system

                    There is a system-provided color palette. I don't know where this UI is in modern Windows, but in versions where you could enable the "classic" theme, you could still configure these colors. They are, of course, exposed to apps, and apps are expected to use them to draw their controls. That, as well as theme elements since XP.

                    > Microsoft take a more “we won’t touch existing stuff in case we break it” approach to their core libs.

                    Making sure you don't break existing functionality is called regression testing. I'm sure Microsoft already does a lot of it for each release.

                    And actually it's not quite that. The transition from 9x to NT involved swapping an entire kernel from underneath apps. Most apps didn't notice it. In fact, the backwards compatibility is maintained so well that I can run apps from the 90s — built for, and only tested on, the old DOS-based Windows versions — on my modern ARM Mac, in a VM, through an x86 -> ARM translation layer.

                • coev 416 days ago
                  > There's no need to visually redesign anything to introduce accessibility.

                  People with motion sickness (reduced animation), the deaf (captions!), and the colorblind would beg to differ

              • KronisLV 418 days ago
                > Accessibility includes interaction design, zoom ability, audio commands, action link ups, alternate rendering modes, alternate motion modes, hooks for assistive devices to interact with the system. It goes far deeper into the system than just labels for a screen reader.

                I wonder where the current status quo lies in regards to both desktop computing and web applications/sites. Which OSes and which GUI frameworks for those are the best or worst, how do they compare? How have they evolved over time? Which web frameworks/libraries give one the best starting point to iterate upon, say, component libraries and how well they integrate with something like React/Angular/Vue?

                Sadly I'm not knowledgeable enough at the moment to answer all of those in detail myself, but there are at least some tools for web development.

                  For example, this seems to have helpful output: https://accessibilitytest.org 
                  There was also this one, albeit a bit more limited: https://www.accessibilitychecker.org
                  I also found this, but it seemed straight up broken because it couldn't reach my site: https://wave.webaim.org/
                  From what I can tell, there are many tools like this: https://www.w3.org/WAI/ER/tools/
                
                And yet, while we talk about accessibility occasionally, we don't talk about how good of a starting point certain component frameworks (e.g. Bootstrap vs PrimeFaces/PrimeNG/PrimeVue, Ant Design, ...) provide us with, or how easy it is to setup build toolchains for automated testing and reporting of warnings.

                As for OS related things, I guess seeing how well Qt, GTK and other solutions support the OS functionality, and what that functionality even is, is probably a whole topic in and of itself.

                • extra88 418 days ago
                  > it seemed straight up broken because it couldn't reach my site: https://wave.webaim.org/

                  It worked for me, it found lots of color contrast problems (white-on-light purple has low contrast). https://wave.webaim.org/report#/https://kronis.dev/
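                  For reference, the contrast check tools like WAVE run is the WCAG 2.x relative-luminance formula; the light-purple value below is a made-up example for illustration, not the site's actual palette:

```python
# WCAG 2.x contrast ratio between two sRGB colors (0-255 channels).
def linearize(c8):
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

white = (255, 255, 255)
light_purple = (196, 167, 231)  # hypothetical color, not the real site palette
ratio = contrast(white, light_purple)
print(round(ratio, 2))  # ~2.09, below the 4.5:1 AA threshold for body text
```

                  WCAG AA asks for at least 4.5:1 for body text (3:1 for large text), which is why white on a light purple gets flagged.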

                  WAVE is also available as a browser extension.

                  Accessibility checkers can be helpful, particularly for catching basic errors before they ship. The large majority of accessibility problems a site can have cannot be identified by software, humans need to find them.

                  Current Bootstrap is not bad if you read and follow all of their advice. I'm not claiming there are no problems lurking amongst their offerings.

                  If you search for "name-of-thing accessibility" and don't find extensive details about accessibility in the thing's own documentation, it probably does a poor job. A framework can't prevent developers from making mistakes.

                  • wizofaus 418 days ago
                    "The large majority of accessibility problems a site can have cannot be identified by software"

                    Bold statement. I used to work in exactly that area, and the reality is humans often simply don't bother finding many of the accessibility issues that automated tools can and do find. Even if such a tool isn't able to accurately pinpoint every possible issue, and inevitably gives a number of false positives (the classic being expecting everything to have ALT text, even when images are essentially decorative and don't provide information to the user), using one at least gives humans a realistic starting point for finding the most serious issues and ensuring they're addressed.

                    However I would never claim that good accessibility support requires significantly more (e.g. >2x) resources, and certainly not at the OS level. In fact, you typically get better accessibility if you use the built-in OS (or browser) provided controls, which are less resource intensive than the fancy custom ones apps seem to like using these days (even MS's own apps are heavy on custom controls for everything).

                    • extra88 417 days ago
                      I currently work in this area (web accessibility) and am just repeating what is commonly understood. When considering what WCAG criteria cover (which is not even everything that could pose a barrier to people with disabilities), most failures to meet the criteria cannot be identified by software alone.

                      For example, the classic I would say is not whether an image needs an alt attribute or not but whether an image's alt attribute value is a meaningful equivalent to the image in the context where it appears.

                      I'm not sure what kind of "resources" you're referring to. If you mean computing resources (CPU, RAM, etc.) standard, contemporary computers do seem to have enough for current assistive technologies, one doesn't need to buy a higher end computer to run them. If you mean OS resources for supplying assistive technologies and accessibility APIs, mainstream OS's are decent but specifically for screen readers there's a lot of room for improvement.

                • cratermoon 418 days ago
                  Thanks for those links. I found a few minor items in my website that I was able to easily address.
                • dagmx 418 days ago
                  > Which OSes and which GUI frameworks for those are the best or worst, how do they compare?

                  Hands down macOS/iOS are the leaders here with Cocoa/SwiftUI/UIKit etc (ultimately basically the same). The OS also has many hooks to allow third party frameworks to tie in to the accessibility.

                  Windows is second in my opinion. Microsoft does some good work here but it’s not as extensive in terms of integrations and pervasiveness due to how varied their ecosystem is now. They do however do excellent work on the gaming side with their accessibility controllers.

                  In terms of UI frameworks, Qt is decent but not great. Electron actually does well here because it can piggyback off the work done for web browsers. Stuff like ImGui ranks at the bottom because it doesn't expose the widget tree to the OS in a meaningful way.

                  I can’t speak to web frameworks. In theory it shouldn’t matter as long as the components are good. Many node frameworks try and install a11y as a package to encourage better accessibility.

                  • JoBrad 418 days ago
                    I switched from windows to macOS, which I’ve been using as my daily driver for the last year or so. Using the touchpad (or maybe the Magic Mouse) is basically a requirement to use “vanilla” macOS. Yes, you can install additional programs to help with window management, etc., but in my experience macOS is absolutely horrible when it comes to accessibility, from this standpoint. Maybe it’s better for colors, TTS, etc.?
                    • dagmx 418 days ago
                      I’m not sure what walls you might have been hitting but macOS is completely useable with speech direction. I had to quite recently add better accessibility support to an app I worked on and I was basically navigating the entire system with voice control and keyboard hotkeys.

                      Voice control in particular is really handy with the number and grid overlays for providing commands.

                    • extra88 418 days ago
                      Did you enable Full Keyboard Access and learn how to use it?
                      • JoBrad 418 days ago
                        I’ll check it out. But this seems to approach accessibility as a feature to be turned on or off. Most of what it enables, based on Apple docs, is not just enabled by default in Windows and many Linux window managers I’ve used; it’s something that developers actively utilize.
                        • extra88 417 days ago
                          That's not where macOS came from. For Windows and Linux, "in the beginning was the command line" but not for Macs.

                          There's plenty one can do in macOS and its native applications with a keyboard by default; those that need more can enable "Use keyboard navigation to move focus between controls." Those that need even more enable Full Keyboard Access. These settings aren't on by default because Apple has decided they'd just get in the way and/or confuse people who use the keyboard but rely on it less.

                          In Safari specifically, pressing Tab doesn't focus links by default as it does in every other browser, because most people use a cursor to activate links, not the keyboard. There also tend to be a lot more links than form inputs, which is what Tab does focus.

                          Macs try to have just enough accessibility features enabled by default that anyone who needs more can get to the setting to turn it on. Something I just learned Macs have that other OS/hardware doesn't is audible feedback for the blind to login when a Mac is turned on while full disk encryption is enabled.

                          I'm not claiming Apple gets everything right or that their approach is the best, I'm just trying to describe the basics of what's there and the outlook driving the choices.

                  • hulitu 418 days ago
                    > Windows, Electron

                    Gray on gray, Teams. Accessible like a hammer: everything looks like nails.

                    • dagmx 418 days ago
                      Irrelevant. You can make bad apps in any framework and bad accessibility choices as well. That’s not a reflection of the framework or tool itself
            • rhdunn 418 days ago
              Keyboard shortcuts and navigation are accessibility.

              Dark/light mode is accessibility.

              Reduced animations/not animating tiles, etc. is accessibility.

              Being able to scale/zoom in on fonts and images is accessibility.

              Ensuring your automated GUI tests can interrogate the application/page's structure and state is accessibility.

              Not reloading the entire page to render search results, which would lose search filter selection and/or current keyboard focus, is accessibility.

              etc.
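              The testing point above can be made concrete: the same structure a screen reader consumes can be asserted on in CI. A minimal, intentionally simplified check using Python's stdlib parser (real accessible-name computation per WAI-ARIA also counts text content, labels, alt, title, etc.; here only aria-label is considered, and the sample markup is invented):

```python
from html.parser import HTMLParser

class NameAudit(HTMLParser):
    """Collects interactive elements that expose no accessible name.

    Deliberately simplified: only the aria-label attribute counts here,
    whereas real name computation also uses text content and labels.
    """
    INTERACTIVE = {"button", "a", "input"}

    def __init__(self):
        super().__init__()
        self.unnamed = []

    def handle_starttag(self, tag, attrs):
        if tag in self.INTERACTIVE and not dict(attrs).get("aria-label"):
            self.unnamed.append(tag)

audit = NameAudit()
audit.feed('<button aria-label="Search"></button><a href="/"></a>')
print(audit.unnamed)  # ['a'] -- the bare link exposes no name
```

              A test suite that fails when `unnamed` is non-empty keeps the app interrogable for both tests and assistive tech.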

        • jeroenhd 418 days ago
          I want touchscreen support on Windows. But guess what? Multitouch worked in Windows 7. If Windows still supported theming basic controls then Microsoft could enable touch screen support in most applications by setting a theme, similar to how they enhance contrast if you enable that feature.

          I understand that bigger stuff and better graphics involve more RAM and the switch to 64 bit doubled the pointer sizes (which is why you can't meaningfully run Windows 7 x64 on 1GB of RAM like you can the 32 bit version) but with 4GB of system RAM you should be able to fit everything in and then some.
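          The pointer-doubling effect is easy to quantify for pointer-heavy structures. A rough back-of-the-envelope sketch (ignoring allocator overhead and struct padding, which make 64-bit even heavier):

```python
# A binary-tree node: two child pointers plus a 4-byte payload.
# Pointer size doubles going from 32-bit to 64-bit; the payload doesn't.
def node_bytes(pointer_size):
    return 2 * pointer_size + 4

print(node_bytes(4))  # 12 bytes per node on 32-bit
print(node_bytes(8))  # 20 bytes per node on 64-bit (~1.7x)
```

          So pointer-heavy data grows well under 2x, which is why 64-bit alone can't explain multi-gigabyte idle footprints.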

          You actually can, as various Linux distributions demonstrate. The algorithms and APIs aren't as well developed, but better window control/accessibility APIs don't take up more than a megabyte of RAM.

          People do ask for many Microsoft features, such as the appification of the interface and the Microsoft store. Just because you didn't ask for it, doesn't mean it's not necessary. However, Microsoft has known for years how to build and implement those requests in a much more compact environment.

          My take is still the same old cynical one: as resources become cheaper, developers become lazier. I don't want to go back to the days of racing the beam with carefully planned instructions, but the moment Electron gained any popularity the ecosystem went too far. "Yes but our customers want features more than a small footprint" is the common excuse I hear, but that ignores all the people calling various support channels or just being miserable with their terribly slow machines.

          • Jochim 418 days ago
            > as resources become cheaper, developers become lazier.

            At most places I've worked it's a struggle to get time allocated towards necessary refactoring that'll ensure new features can be delivered in a timely fashion.

            I'd love to spend time making the product more efficient but unless I can demonstrate immediate and tangible business value in doing so, it's never going to be approved over working on new features.

          • intelVISA 418 days ago
            Few companies want to pay the exorbitant costs for (good) native development given the ratio of devs working at the JS level vs. below.
        • nateb2022 418 days ago
          >No one asked for Windows on touchscreen anything. Microsoft decided that themselves and ruined the UX for the remaining 99% of the users that still use a mouse and a keyboard.

          I have several devices, including a couple of Linux PCs, an M1 MacBook Air, and a Microsoft Surface Go. If Windows 11 didn't support touchscreens, I would have gone with an iPad. However, Windows 11 is the _best_ touchscreen OS to date.

          Unlike iOS or iPadOS, Windows 11 runs desktop apps and combines the convenience of touchscreen scrolling/interaction with the desktop experience. Windows 11 does this very, very well.

          • Andrex 418 days ago
            I'm curious if you've used Chrome OS recently, there's a lot of good work there too. Touch is there if you need it with the keyboard open, then goes into tablet mode if the laptop is convertible or detachable. The touch/tablet UI has lost many rough edges in the last 2-3 years, and it hasn't affected the mouse/keyboard mode most people use Chromebooks for.

            I don't use Windows anymore but I remember thinking "this is exactly what I've always wanted from a convertible/touch-support-in-desktop OS"...

        • anthk 418 days ago
          >That's the desktop compositor. Windows 7 already had one and ran on 1 GB of RAM.

          Compiz ran fine with a 128MB GPU and 512MB of RAM.

          • NovaPenguin 418 days ago
            I think I first saw it running on a Geforce with 64MB of RAM. Even then it was smooth as butter.

            Now that I think about it, Mac OS X was doing GPU compositing back in 2000/2001 and those machines usually only had about 16MB of VRAM. I remember it running fairly well on a 2005 Mac Mini G4 with 32MB of VRAM.

            • duskwuff 418 days ago
              The first versions of Mac OS X only supported software rendering. GPU compositing didn't show up until 2003, in Mac OS X 10.2. It was branded as "Quartz Extreme".
              • NovaPenguin 416 days ago
                I did not know that! There was about a 6-7 year gap between 1997-2004 where I didn't really do much with Macs. But your timeline seems spot on; it was 10.3 when they introduced Exposé into the system. A great demonstration of the GPU functionality in action.
          • grishka 418 days ago
            Actually, IIRC the only requirement for DWM to work was a GPU that supports shaders, because that's what makes the window border translucency/blur effect possible.
            • justsomehnguy 418 days ago
              Compatible driver, actually. There were at least DWM 1.0 (Vista) and DWM 1.2 (Win7), but Intel never provided a compatible driver for the... 915? series, so you couldn't enable composition on them, despite the hardware being capable enough.
        • hedora 418 days ago
          Prodigy had vector based graphics in a terminal back in the 1980’s. Granted, that targeted EGA and 2400 baud modems, but I wonder how well it would work on modern hardware if you just gave it a 4k, 24bit frame buffer, and fixed up the inevitable integer overflows.
          • whartung 418 days ago
            Actually, I've run Citrix (ancestor of Remote Desktop) on a 14.4k modem. Once all the bitmaps are downloaded and cached (those app launch 1/2 screen splash pages were murder), it ran pretty well. The meta graphic operations (lines, circles, fills, etc.), fonts, etc. worked fine. Any large pixmap operations were crushing, but most productivity apps didn't use those as much as you'd think.
        • inglor_cz 418 days ago
          "No one asked for this. My personal opinion"

          You didn't ask. It is, as you say, your personal opinion.

          From my POV, current Web is fine and the fact that browsers are powerful liberated us from writing specialized desktop apps for various OSes. I am much happier writing a Web UI than hacking together Win32 or Qt-based apps. Or, God forbid, AVKON Symbian OS UI. That was its own circle of hell.

          • grishka 418 days ago
            > liberated us from writing specialized desktop apps for various OSes

            I use macOS and I very much dislike anything built with cross-platform GUI toolkits, and especially the web stack. And it's always painfully obvious when something is not native. It doesn't behave like the rest of the system. It's not mac-like. It draws its own buttons from scratch and does its own event handling on them instead of using NSButton. I don't want that kind of "liberation". I want proper, native, consistent apps. Most other people probably do too, they just don't realize that or can't put it into words.

            The only counter-example out there known to me is IntelliJ-based IDEs. They're built with Swing, but they do somehow feel native enough.

             Also, developer experience is not something users care about. And I'm saying that as a developer myself. Do use fancy tools to make your job easier, sure, but avoid those that stay inside your product when you ship it.

            • bee_rider 418 days ago
              I don’t like the direction GUIs have gone either, and think the JavaScript-ization of everything has been pretty dumb. But it seems that bloat is doing well in the market.

              Users might not care about developer experience, but everything is a trade off: developer time is a cost, the cost of producing software is an input into how much it needs to cost. Users seem to want features delivered quickly, without much regard to implementation quality.

              • grishka 418 days ago
                Users just don't have much say in the matter. Case in point: Discord and Slack are atrocious UX-wise. You're still forced to use them because, as with any walled-garden communication service, you aren't the one making this choice.
                • ahtihn 417 days ago
                  Discord UX is better than anything there was before, which is why it replaced all the other software people used before in gaming communities.
                • bee_rider 417 days ago
                  IM-ing programs seem more like the exception, rather than the rule.
            • matthew-wegner 418 days ago
              Hold up. It's been ~14 years since Apple shipped machines with 2GB of memory as their base model.

              macOS (and iOS) have incredibly good screen reader support, as well as all of the things you're complaining about in your original comment at the top of this thread. Clearly those things are absolutely gobbling memory, and yet you don't seem to connect the dots that they're directly contributing to high memory requirements of macOS?

              I mean, 8GB on stock machines today is barely manageable. You can't buy a Mac with less than 8GB today; you can't even buy a phone with 2GB or less. I'm not sure you're in a position to rail against high-memory bloat in computing today.

              p.s. I say this as someone who uses macOS as their daily driver and has for a very long time

              • Dylan16807 418 days ago
                > I'm not sure you're in an position to rail against high-memory bloat in computing today.

                Nobody is a hypocrite for buying X gigabytes of ram but also wanting the naked operating system to use a much smaller amount, or wanting single programs to use a much smaller amount.

                > macOS (and iOS) have incredibly good screen reader support, as well as all of the things you're complaining about in your original comment at the top of this thread. Clearly those things are absolutely gobbling memory, and yet you don't seem to connect the dots that they're directly contributing to high memory requirements of macOS?

                What makes a screen reader gobble memory?

                And it definitely shouldn't gobble memory when it's not running.

                • devinprater 417 days ago
                  Mainly the TTS engine being ready for input, stuff like that. Of course you could go to Linux, where you have to enable assistive technologies support before the whole desktop understands that it should work with screen readers. I'm guessing that's where accessibility does take up RAM and resources.
              • grishka 418 days ago
                Screen reader support by itself doesn't gobble memory. Android has had it for ages, and still runs on devices with less than 1 GB RAM (Android Wear watches).

                Running several instances of Chromium though... You'll probably run one anyway at all times as your actual web browser, but additional ones in the form of "oh so easy to build" Electron apps don't help. In Apple's eyes, though, you should absolutely ignore other browsers and use Safari exclusively. It might not be as much of a memory hog as Chrome — I haven't researched this, these are simply my guesses.

                I also heard that M1 Macs are better at memory management compared to Intel. Again, I don't have any concrete evidence to back this up, but knowing Apple, it's believable.

          • blacklion 418 days ago
            It liberated you as a developer. As a developer, I can understand that. As a user, I hate you. You never provide me, the user, with a native experience via a web UI. You use custom controls that break the conventions of native controls a little here and there. You cannot use the full power of the OS (a YouTube or Spotify player doesn't pause itself when the workstation is locked; my native player of choice does). You eat my resources. You cannot make your application consistent with applications from other vendors, so I need to remember different patterns for different apps. Your typical browser app doesn't have ANY features for power users, like shortcuts for all commands and useful keyboard controls (not to mention full customization of those controls, toolbars, etc.). Damn you and your laziness!

            But I understand that most of my complaints are the complaints of a power user with 25+ years of experience and muscle memory, and I'm not the target audience for almost any new app. You win :-(

            • inglor_cz 417 days ago
              Everything is a trade-off. If, as a developer, you have to spend ungodly hours on learning multiple UIs, you will have less time left for the actual business logic of your app. Which, from the user's side, means one of the following three:

              a) nice looking, but less capable apps,

              b) more expensive apps, or, apps that have to be paid even if they could be free in an alternate universe,

              c) limited availability - app X only exists for Windows and not Mac, because either a Mac programmer isn't available or would be too expensive.

              Developing for multiple UIs at once is both prone to errors and more expensive, you wind up paying for extra developers, extra testers/QA, extra hardware and possibly extra IDEs and various fees. Such extra cost may be negligible for Google, but is absolutely a factor for small software houses outside the richest countries, much more so for "one person shows" and various underfunded OSS projects.

              I remember the hell that was Nokia Series 60 and 90 programming. Nokia churned out a deluge of devices that theoretically shared the same OS, but they had so many device-specific quirks and oddities on the UI level that you spent most of the time fighting with (bad) emulators of devices you could not afford to buy. This is the other extreme and I am happy that it seems to be gone forever.

              • blacklion 416 days ago
                If your application can be useful on different OSes (and now there are only 3 desktop OSes in existence, as porting a desktop application to mobile requires a completely different UI and UX no matter what technology you use!), break it into business logic and UI, and find a partner or hire a developer who loves building native UIs for the other OS. The MVC pattern is old and well known (though not fashionable now, I understand).

                OSS projects are completely different story, of course, no questions to OSS developers.

                I prefer to pay $200 for a native application rather than $100 for an Electron one.

                Oh, who am I trying to fool? Of course, it will be an Electron app with a $9.95/month subscription now :-(

                • inglor_cz 416 days ago
                  "break it into business logic and UI and find partner or hire developer who love to develop native UIs for other OS"

                  As I said in my previous comment, this is quite expensive, and people inside Silicon Valley rarely understand how cash-strapped the software sector in the rest of the world is. In Czech, we have a saying "a person who is fed won't believe a hungry one" and SV veterans that are used to reams of VC cash supporting even lossy businesses like Uber have no idea that the excess spending needed to hire another developer for several months somewhere in Warsaw or Bucharest may kill a fledgling or small company.

                  In this, the unity of the Web is a life-saver.

                  • blacklion 416 days ago
                    I'm not in Silicon Valley or even USA, and never seen any venture money or VC.

                    But, again, I prefer to make one thing good rather than two things good enough.

        • devinprater 417 days ago
          An optional installable component until you have a blind person doing tech support and they have to walk a tech-illiterate person through installing the accessibility stack lol. Or until you suddenly go blind from a condition or accident and have to mouse your way through the interface, blind, to install that component. Ugh, ableism.
        • user3939382 417 days ago
          > they should again become the hypertext document viewers they were meant to be

          There is a small resurgence of the gopher protocol that I believe is rooted in this sentiment.

        • worewood 418 days ago
          I wish I could upvote you a thousand times. This this and this a 100%. It's sad today's "developers" will look at this and just say "ok, boomer"...
          • pixl97 418 days ago
            Being someone from back in those days, I'll tell you to load up that software that fits in some small amount of memory. You'll find most of it is crash-filled hot garbage missing the features you need. And the moment you wanted to add new features you'd start importing libraries, bloating the size of the application.
            • sicp-enjoyer 418 days ago
              So you would say programs today are equally efficient and more stable, just have more features?
              • pixl97 418 days ago
                In general I would say far more stable, with far more features.

                But this of course is in the metrics of how you measure. Windows 3.1 for example was a huge crashing piece of crap that was locking up all the damned time. MacOS at the time wasn't that much better. Now I can leave windows up for a month at a time between security reboots. Specialized Windows and Linux machines in server environments on a reduced patching schedule will stay up far longer, but generally security updates are what limits the uptime.

                I remember running Windows applications and receiving buffer overflow errors back then. If you got a buffer overflow message today you'd think that either your hardware is going bad or someone wrote a terrible security flaw into your application. And back then there were security flaws everywhere: 'Smashing the Stack for Fun and Profit' wasn't written until '96, well after consumers had started getting on the internet en masse. And if you were using applications like Word or Excel you could expect to measure crashes per week rather than per month, and many of those crashes are now completely recoverable in applications like Office.

                • sicp-enjoyer 418 days ago
                  Did windows 3.1 even have memory separation? So a bad program could crash the whole system?
              • ElectricalUnion 418 days ago
                Windows 95 can't even stay up for more than 49.7 days before crashing all by itself.
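                That oddly specific 49.7-day figure is the wraparound point of a 32-bit millisecond tick counter, the well-known Windows 95/98 timer bug. The arithmetic:

```python
# A 32-bit counter of milliseconds wraps after 2**32 ms.
wrap_days = 2 ** 32 / (1000 * 60 * 60 * 24)
print(f"{wrap_days:.1f} days")  # 49.7 days
```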
                • userbinator 418 days ago
                  Neither can Windows 10 on default settings, for more than around a month, before force-rebooting all by itself and losing all your work.
                  • CoolCold 417 days ago
                    I've been on Win11 for 1.5 years or so (Win11 Insider Beta channel), and before that I was on the Win10 Beta/Dev channels. From what I remember, I was warned multiple times and asked to pick a time, and only after the user (me) showed no cooperation was the system forcibly rebooted, which for a consumer-grade edition (I have Pro) is fine from my PoV. I don't want my system, or the systems around me, to be part of botnets like Linux boxes of all sorts.
                  • pixl97 417 days ago
                    For many applications Windows 10 saves state and comes back right where you started on a security update reboot.
                    • ElectricalUnion 416 days ago
                      > For many applications Windows 10 saves state and comes back right where you started on a security update reboot.

                      This needs application support, by this broad definition all operating systems "saves state and comes back right where you started on a security update reboot".

                  • prmoustache 417 days ago
                    Nobody loses his work on a reboot in 2023.
      • jeroenhd 418 days ago
        Resolutions and HDR are one area where I think the extra RAM load and increasing application sizes make complete sense. However, my monitors run at 1080p, don't do HDR, and my video files are encoded at a standard colour depth. Despite all this, standalone RAM usage has increased over the years.

        Accessibility has actually gone down with the switch to web applications. Microsoft had an excellent accessibility framework with subpar but usable tooling built in, and excellent commercial applications to make use of the existing API, all the way back in Windows XP. Backwards compatibility hacks such as loading old memory manager behaviour and allocating extra buffer space for known buggy applications may take more RAM but don't increase any requirements.

        I agree that requirements have grown, but not by the amount reflected in standby CPU and memory use. Don't forget that we've also gained near-universal SSD availability, negating the need for RAM caches in many circumstances. And that's just ignoring the advance in CPU and GPU performance since the Windows XP days, when DOS was finally killed off and the amount of necessary custom-tailored assembly drastically dropped.

        When I boot a Windows XP machine, the only thing I can say I'm really missing as a user is application support. Alright, the Windows XP kernel was incredibly insecure, so let's upgrade to Windows 7 where the painful Vista driver days are behind us and the kernel has been reshaped to put a huge amount of vulnerable code in userspace. What am I missing now? Touchscreen and pen support works, 4k resolutions and higher are supported perfectly fine, almost all modern games still run.

        The Steam hardware survey says it all. The largest target audience using their computer components the most runs one or two 1080p monitors, has 6 CPU cores and about 8GB of RAM. Your average consumer doesn't need or use all of that. HiDPI and HDR are a niche and designing your OS around a niche is stupid.

        • michaelmrose 418 days ago
          SSD access times are as much as 100,000 ns; memory is around 50 ns. SSDs do not negate the virtue of caches.
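          For scale, here's rough arithmetic on the latency gap, using assumed order-of-magnitude figures rather than measurements:

```python
# Rough order-of-magnitude latencies (assumed typical figures, not benchmarks).
dram_ns = 100          # DRAM access: ~100 ns
nvme_ns = 100_000      # NVMe random read: ~100 microseconds
hdd_ns = 10_000_000    # spinning-disk seek: ~10 ms

print(f"NVMe is ~{nvme_ns // dram_ns}x slower than DRAM")
print(f"HDD is ~{hdd_ns // dram_ns}x slower than DRAM")
```

          Three orders of magnitude between DRAM and even fast flash is why RAM caches still pay for themselves.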
          • jeroenhd 417 days ago
            True, but with those access times you can wait a lot longer for content to be loaded into RAM. Hard drives are the reason that for many years games needed to duplicate their assets, for example: seek times slowed down loading, and putting the same content in the file twice, at the right places, sped up the loading process significantly. Games today still have special HDD code paths because of the difference in performance class.

            SSDs won't replace RAM but many RAM caches aren't performance critical; sometimes you need your code to be reasonably fast on a laptop with a 5400 rpm hard drive and then you have very little choice of data structures. With the random access patterns SSDs allow this complication quickly disappears. You won't find many Android apps that will cache 8MB block reads to compensate for a spinning hard drive, for example.

      • flohofwoe 418 days ago
        > fancy APIs for easier development allowing for more features

        Windows didn't really see a lot of actual progress in this area since the Win2k days. Lots of activity and churn yes, but little actual progress.

        • _Algernon_ 418 days ago
          Further abstractions on top of windows have though: websites, electron apps, etc.
      • yobbo 418 days ago
        > HDR, fluid animations, monstrous resolutions, 3D everything

        May I remind of https://www.enlightenment.org/

        20 years ago, there were "live cds" that could do most of what you mention, at maybe 512 MB ram.

        • mixedCase 418 days ago
          I ran e16 and then e17 as my main desktop back in the day for a good while. I'm sorry but what we had back then was nowhere even near what I'm talking about.

          It definitely was pretty for the day, though.

          • prmoustache 417 days ago
            What do we have today that we didn't have back them in term of bare desktop support?

            I mean we have larger resolution support and scaling for HiDPI, better/faster indexing, better touchpad support. Can you name anything else? Localization hasn't progressed that much; I remember already being able to select some barely-spoken dialects on Linux 20 years ago.

          • anthk 418 days ago
            Then, OSX. A G4 with 512MB of RAM could perfectly do that at 1280x800.
            • whartung 418 days ago
              NeXTSTEP 3.1 ran fine at 1152x832 4 shade mono with 20MB of RAM. 32MB if you were running color.

              It was also rendering Display PostScript on a 25Mhz '040. One of the first machines in its day that allowed you to drag full windows, rather than frames on the desktop. High tech in action!

              • talideon 417 days ago
                You could also do that in '92-ish on RISC OS 3 running on a 1MB Acorn Archimedes with 12MHz ARM2 processor, with high quality font antialiasing. Those were the days!
      • ilyt 417 days ago
        > Hasn't it, though? HDR, fluid animations, monstrous resolutions, 3D everything, accessibility, fancy APIs for easier development allowing for more features, support for large amounts of devices, backwards compatibility,

        So, the features Windows 7 had? I remember running a 3D desktop with a compositor and fancy effects on a 1GB RAM laptop on Linux...

        RAM requirements for Windows as an OS are ridiculous.

      • wslh 417 days ago
        Please don't miss the malware within the OS itself: license services for software such as Microsoft Office and Adobe, and other applications without enough resource bounds.
      • znpy 418 days ago
        to be honest i'd give up most of today's niceties to get a snappier computer experience.

        and to be honest, nowadays the biggest issue is the web browser and the sheer amount of memory and processing that modern websites use.

        it's unbelievable.

        • mixedCase 418 days ago
          It is still possible to have a snappy computer experience. Go Linux, use a very configurable distro (Arch, Gentoo, NixOS), choose a lightweight DE and app ecosystem and it will get you there for the most part.

          Browsers are still going to be the sticking point, but with aggressive ad blockers/NoScript and hardware that's not terribly old (NVMe storage is priority 1), you should be set.

          But of course, snappiness isn't free and you have to spend some time doing first time set-ups and maintenance.

          • znpy 418 days ago
            I’m on debian and using xfce.

            The problem is the web browser.

            I've got 16 GB of RAM and the browser is using most of it. I can literally see the swap space emptying when I have to (as in "I'm forced to") sacrifice my browsing session (xkill the browser) due to constant swapping out to disk.

            And I'm using a PCIe gen 3 NVMe disk, and I already lowered swappiness.

            The problem is the web browser.

            • jerf 418 days ago
              Do you have an ad blocker, as mixedCase suggests?

              At this point, my primary use case for ad blocking isn't the ad blocking itself; it is 1. the security of blocking ads, one of the worst vectors for attacks in the wild, and 2. the greatly reduced system resources my browser uses. The ad blocking itself is a further bonus.

            • adrianN 417 days ago
              I have 32GB RAM and Firefox is currently using 2.5% of that. I use ublock, noscript and Auto Tab Discard.
            • opan 418 days ago
              In my experience, changing swappiness makes everything worse and you end up back with the default value in the end.
            • enticingturtle 418 days ago
              Are you actually seeing degraded performance, or do you just have an aesthetic dislike of swapping?
              • znpy 417 days ago
                I’m seeing literal freezes (can’t do anything, even mouse pointer is frozen). But if I kill the browser everything becomes snappy again.
                • mixedCase 417 days ago
                  I'd suggest again to try NoScript/ad blocking; disable hardware acceleration if you have it enabled, or enable it if disabled.

                  If even then you have no success, I'd suggest you try something like EndeavourOS. Browsers have issues, but that is not normal. You're not using Debian stable on the desktop, right?

      • pigsty 417 days ago
        Installing command line tools on my Mac through homebrew takes up hundreds of megabytes each time I download anything.

        I know that it installs various libraries. I do not know why those libraries are dozens of megabytes each.

      • agumonkey 418 days ago
        more features, but not sure they're features i want or need, nor fun interactions

        that said 2GB is acceptable considering the state of everything

        not saying i wouldn't like to have QNX class back

    • TacticalCoder 418 days ago
      > Let’s pause for a bit and dwell on the absurd amount of RAM it takes to run it even after this exercise.

      I agree and I find the apologists to be completely wrong. I run a modern system: 38" screen, 2 Gbit/s fiber to the home. I'm not "stuck in the past" with a 17" screen or something.

      The thing flies. It's screaming fast as it should be.

      But I run a lean Debian Linux system, with a minimal window manager. It's definitely less bloated than Ubuntu and compared to Windows, well: there's no comparison possible.

      Every single keystroke has an effect instantly. After reading the article about keyboard latency, I found out my keyboard was one of the lower-latency ones (HHKB), and yet I fine-tuned the Linux kernel's USB 2.0 polling of keyboard input to be even faster. ATM I cannot run a real-time kernel because NVidia refuses to support a non-stock kernel (well, that's what the driver says at least), but even without that: everything feels, and actually is, insanely fast.
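(The USB polling tweak referred to here is, I assume, the usbhid module's poll-interval parameters; a sketch, assuming a kernel recent enough to expose `kbpoll`:)

```shell
# Check the current keyboard poll interval in ms (0 = use the device's default)
cat /sys/module/usbhid/parameters/kbpoll

# To force 1 ms (1000 Hz) polling, add a kernel boot parameter,
# e.g. in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet usbhid.kbpoll=1"
# then regenerate the bootloader config and reboot:
sudo update-grub
```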

      I've got a dozen virtual workspaces / virtual desktops and there are shortcuts assigned to each of them. I can fill every virtual desktop with apps and windows and then switch like a madman on my keyboard between each of them: the system doesn't break a sweat.

      I can display all the pictures on my NVME SSD in full screen and leave my finger on the arrow key and they'll move so quickly I can't follow.

      Computers became very fast, and monitor sizes / file sizes for regular usage simply didn't grow anywhere near as quickly as CPU performance.

      Windows is a pig.

      • pxc 418 days ago
        I love this comment for getting at what, in my opinion, Linux on the desktop is all about: spending your time with a computer that just plain feels great to use.

        It doesn't look the same for everyone, of course. It's not about some universalizable value like minimalism. But this is a great example of one of the dimensions in which a Linux desktop can just feel really great in an almost physical way.

      • yeuxardents 418 days ago
        Would you mind expounding upon what you have done to achieve 'leanness' in this Debian system? Thanks!
        • snickerer 417 days ago
          I think the fast feeling and the low ram footprint comes mostly from the choice of the window manager. Just use XFCE and you'll be fine.
      • zozbot234 418 days ago
        The low-end requirements for Debian GNU/Linux (assuming a graphical install and an up-to-date version) are not that low. They're higher than the low-end for Windows XP when it first came out, and probably close to the official requirements for "Vista-capable" machines. So yes, it's a very efficient system by modern standards but it does come with some very real overhead nevertheless.
        • anthk 418 days ago
          Vista-capable wasn't that capable: it required 1GB of RAM to run well. Debian with ZRAM and a light DE could run in 512MB of RAM with Seamonkey + uBlock Origin, with patience.
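(A rough sketch of a manual zram swap setup using util-linux's `zramctl` — the size and the zstd choice are illustrative; Debian also packages this more conveniently as zram-tools:)

```shell
sudo modprobe zram                                       # load the compressed-RAM block driver
dev=$(sudo zramctl --find --size 512M --algorithm zstd)  # allocate e.g. /dev/zram0
sudo mkswap "$dev"                                       # format it as swap
sudo swapon --priority 100 "$dev"                        # prefer it over disk-backed swap
```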
      • gigel82 418 days ago
        So, how much RAM does it take? You left that bit out.
        • vbezhenar 418 days ago
          If they spent money on an NVMe SSD, a 38" display and 2 Gbps fiber, my bet is 128GB.
          • shortstuffsushi 418 days ago
            Could you explain why any of the things he says make you think a number that high? I'm just finishing building my first PC ever (I've used computers for ... 20 years? But never actually built one). And I have a 1TB NVMe SSD from Western Digital, it was about 60 bucks. I have a 35" BenQ monitor from work, I think it was around $600 at the time of purchase. I don't have fiber at my home, but from what I understand, it's not prohibitively expensive in general. Anyway - I went with 16gb RAM. That felt like a reasonable starting point considering my current and prior daily driver were there as well. My build (minus admittedly expensive monitor) was, to me compared to the Macbooks I usually have for work, a fairly modest $1250 or so. So, roughly the same specs - seems like nothing too crazy?
            • flask_manager 418 days ago
              Likely the fiber setting expectations, 2gbps is the "premium" tier in many places, where the monthly difference between fast and the top speed is about the same as 32gb of ram.
          • colinsane 417 days ago
            i’ll bet they went for less than max RAM capacity, even if they could afford it:

            - most motherboards reduce the DDR clock when using > 2 sticks.

            - higher capacity RAM sticks use more “ranks” (AKA “banks”), which increases latency.

            as of 2 years ago, 2x single rank DDR would limit you to 64GB. but 2 years is a long time in computerland: 64GB single rank sticks sound plausible.

      • DavideNL 417 days ago
        What would you say is the best Desktop environment for Debian for an average user (not Development), modern PC, and some gaming (nvidia 3080Ti)?
        • KronisLV 417 days ago
          Personally, XFCE is pretty lightweight, customizable and stable. I actually did a blog post where I ran Linux Mint (based on Ubuntu) with XFCE, so you can get a rough idea of it in some screenshots: https://blog.kronis.dev/articles/a-week-of-linux-instead-of-...

          It's not particularly interesting or pretty, but it works well and does most if not all of what you might need, so it's my choice for a daily driver. Here's the Debian wiki page on it: https://wiki.debian.org/Xfce

          Apart from that, some folks also like Cinnamon, MATE, GNOME or even KDE. I think the best option is to play around in Live CDs with them and see which feel the best for your individual needs and taste. Do note that Ubuntu as a base distro might give you fewer hassles in regards to proprietary drivers, if you don't care about using only free software much.

          • DavideNL 416 days ago
            Thank you for the info;

            I was already leaning towards XFCE so i will give that a try.

            Also i did some reading on the proprietary drivers (nvidia, etc.) I'm going to install dual boot Debian/XFCE and Pop!_OS for the gaming.

            I still can't believe that Windows has turned into such a bloatware/mess that i'm actually at a point i can't live with it anymore...

            • KronisLV 416 days ago
              > I still can't believe that Windows has turned into such a bloatware/mess that i'm actually at a point i can't live with it anymore...

              That is quite unfortunate, especially because there is some software that I think Windows does better - like MobaXTerm or 7-Zip (with its GUI), FancyZones (for window snapping) and most of the GPU control panels.

              That said, as that article of mine shows, Linux on the desktop is actually way better than it used to be years ago and gaming is definitely viable, even if not all of the titles are supported. Sadly, I don't think that'll happen anytime soon, but it's still better than nothing!

              I'll still probably go the dual boot route with Windows and Linux, or maybe will have a VM with GPU passthrough for specific games on Linux, although I haven't gotten it working just right, ever. Oh well, here's to a brighter future!

      • taskforcegemini 416 days ago
        38" doesn't say anything though; what matters is the resolution. Smaller screens at the same resolution get you more PPI.
    • 5e92cb50239222b 418 days ago
      Well, other operating systems are still relatively decent at this. My main Linux install eats ~250 MiB of RAM after startup, and I've spent exactly zero amount of time on that, so it can be trimmed down further. That's on a system with 32 GiB of RAM — if you have less RAM, it will eat even less since page tables and various kernel buffers will be smaller.
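(If you want to check your own numbers, a quick sketch — figures will vary with kernel and services:)

```shell
free -h                                    # overall usage; compare "used" vs "buff/cache"
ps -eo rss,comm --sort=-rss | head -n 10   # the ten fattest processes by resident memory
```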

      FreeBSD can be comfortably used on systems with 64 MiB of RAM for solving simple tasks like a small proxy server. It has always been good at this — back in the day cheap VPS often used it (and not Linux) precisely because of its small memory requirements.

      • anthk 418 days ago
        OpenWRT Linux with Musl ran fine under 48MB of RAM, 7-8MB running. Modern GNU/Linux, I mean.
        • avx56 418 days ago
          No GUI though right?
          • prmoustache 417 days ago
            Today's version of IceWM takes around 16 MB of memory; Xorg will add a bit to it.

            There are smaller window managers, but I chose this one as an example because it gives a similar experience to the Windows XP of old.

            I have experimented with slimming down a desktop as much as possible. But once you start a web browser with more than 3 tabs, memory usage goes through the roof. In the end, if you want to run an old system with 512MB of RAM, you are kind of forced to use the web sans JavaScript and images. You are almost better off using links or w3m and TUI apps for everything. NetSurf can work too if you limit the number of tabs open.

            On a 1GB system you can definitely use a modern web browser, but you definitely need the ad/tracker-removal extensions and have to take good care not to open more than 2-3 tabs, or you will start swapping a lot.

            • anthk 417 days ago
              1GB user here. 4-5 tabs is fine under Luakit, ZRAM (1GB virtual + 1GB physical = 2GB) and a hosts blocking file.
      • vitorgrs 417 days ago
        Which distro? The last Linux distros I tried weren't exactly lightweight RAM-wise.
    • Genbox 418 days ago
      I've worked on several projects where performance was an afterthought. After the product scaled a bit, it suddenly became the highest priority - but at that time, it was impossible to fix. At least for everyone that created the problem to begin with.

      I've taught high performance data structures to dev teams. I've tried to explain how a complex problem can sometimes be solved with a simple algorithm. I've spent decades on attempting to show coworkers that applying a little comp-sci can have a profound effect in the end.

      But no. Unfortunately, it always fails. The mindset is always "making it work" and problem solving is brute-forcing the problem until it works.

      It takes a special kind of mindset to keep systems efficient. It is like painting a picture, but most seem to prefer doing it with a paint roller.

      • wizofaus 418 days ago
        And I've worked on systems where months were essentially squandered on performance improvements that never paid off, because we never grew the customer base sufficiently for them to be worthwhile...

        I'm all for dedicating time and effort towards producing performant code, but it does come at a cost - in some cases, a cost of maintainability (for an extreme example there's always https://users.cs.utah.edu/~elb/folklore/mel.html). In fact I'd suggest in general if you design a library of functions where obviousness/clarity/ease-of-use are your primary criteria, performance is likely to suffer. And there are undoubtedly cases where the cost of higher-grade hardware (in terms of speed and storage capacity) is vastly lower than that of more efficient software. I'd also say performance tuning quite often involves significant trade-offs that lead to much higher memory usage - caching may well be the only way to achieve significant gains at certain scales, but then as you scale up even further, the memory requirements of the caching start to become an issue in themselves. If there were a simple solution it would have been found by now.

        • Genbox 417 days ago
          Performance is not the same as efficiency, and efficiency can't be solved with more hardware.

          Let's say I build a sorting algorithm that is O(N^2) complexity and works fine for small inputs (takes <1 millisecond), but it is going to be used for large data systems. Suddenly it takes hundreds of thousands of hours to sort the data.
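(A toy illustration of that scaling cliff — the sort and the input sizes here are mine, not the parent's actual system:)

```python
import random
import time

def bubble_sort(xs):
    """A deliberately naive O(n^2) sort: fine for tiny inputs, hopeless at scale."""
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

for n in (200, 2_000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    result = bubble_sort(data)
    elapsed = time.perf_counter() - t0
    assert result == sorted(data)  # correctness check against the O(n log n) builtin
    print(f"n={n:>5}: {elapsed:.3f}s")  # each 10x in n costs roughly 100x in time
```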

          One of the corps I worked with went full scalability in their architecture. One-click deployments, dynamic scaling of servers, rebalancing of databases, automatic provisioning of storage. They were handling 40-50k requests per second with their 15-ish large server farm, which could scale down to 5 servers, or up to 50-ish before it began to wobble.

          I got called in because the company had gotten a large client that needed 100k requests per second. They tried scaling the system to fit the need, but the whole thing got unstable and their solution was "more operations people to manage it".

          I built a custom solution for the backend. It took about two months. The new system could do about 2100k requests per second on one server. Scalability of the new system was ~90% efficient as well, so there's lots of capacity for the future.

          None of their developers understood computers or the science behind them. They were all educated and experienced developers, but none of that were applied to the problem. They were just assembling parts from the hardware store until something worked, and the resulting Frankenstein's Monster was put into production.

          • wizofaus 417 days ago
            I'm struggling to believe any single server could usefully service 2100k (well over 2 million!) requests per second. Even Google, with their vast farm of servers, reportedly only process less than 100k requests per second globally. I've certainly read of servers capable of handling in the order of 1000k requests per second as a benchmark, but the requests are usually pretty trivial (the one I saw literally did no input processing at all, and just returned a single fixed byte! But was written in Java, surprisingly.) At any rate, I would think a tiny % of real-life systems actually need to be able to support that sort of load, and bringing in somebody to do the scalability work once it's clear it's needed seems like exactly the right strategy to me.
            • Genbox 415 days ago
              Not serving, but handling 2100k requests. Your skepticism is rightly placed, as the HTTP protocol is yet another example of an inefficient protocol that is nonetheless used as the primary protocol on the internet. Some webservers[1] can serve millions of requests per second, but I'd never use HTTP in code where efficiency is key.

              No, I'm talking about handling requests. In this particular case, requests (32 to 64 bytes) were flowing through several services (on the same computer). I replaced the processing chain with a single application to remove the overhead of serialization between processes. Requests were filtered early in the pipeline, which made a ~55% reduction in the work needed.

              Requests were then batched into succinct data structures and processed via SIMD. Output used to be JSON, but I instead wrote a custom memory allocator and just memcpy the entire blob on to the wire.

              Before: No pre-filtering, off-the-shelf databases (PSQL), queue system for I/O, named pipes and TCP/IP for local data transfer. Lots of concurrency issues, thread starvation and I/O bound work.

              After: Aggressive pre-filtering, succinct data structures for cache coherence, no serialization overhead, SIMD processing. It can saturate a 32-core CPU with almost no overhead.

              [1] https://www.techempower.com/benchmarks/#section=data-r13&hw=...

    • tester756 418 days ago
      > has not grown in complexity by four orders of magnitude in 20 years.

      How about all those sandboxes, protections and mitigations?

      Nowadays people care about security waaay more than people did 20-30 years ago.

      • gbin 418 days ago
        Unfortunately more like nagging features, crapware, etc...
    • NovaPenguin 418 days ago
      My go-to on this was that I remember running Debian on a Pentium 166 with 32MB of RAM back in 98/99. It would boot to the desktop using only 6MB. It wasn't flashy, but it could handle the basics. Heck, Windows XP would boot to the desktop using a little under 70MB.

      But this isn't just Windows: currently I am on Kubuntu 22.04 and it uses about 1.5GB to get to the desktop! Yes, it is very smooth and flashy, but it seems like a bit much.

      This is why I am interested in projects like Haiku and SerenityOS; they may bring some sanity back into these things.

      • vbezhenar 418 days ago
        Minimal Debian in a VM eats something like 20-40 MB of RAM. Not 6 MB, but still a rounding error by today's standards.

        I guess that with careful selection of GUI components one could fit an empty desktop into 60 MB.

        Until you start a browser, anyway.

        • NovaPenguin 416 days ago
          I guess nowadays it is a choice, back then it was all we had! :D

          But good to know that we can still 'Hyper-mile' our OS.

    • api 418 days ago
      My favorite is Geos for the Commodore 64:

      https://en.m.wikipedia.org/wiki/GEOS_(8-bit_operating_system...

      Obviously there were huge limitations, but it shows what can be done. This fit on one 170K floppy and ran on a ~1MHz 8-bit machine with 64K of RAM.

      In the 1990s I ran both Linux and Windows on less than 64M of RAM with IDEs, web browsers, games, and more.

      If I had to guess what were possible today I’d fall back on the fairly reliable 80/20 rule and posit that 20% of todays bloat is intrinsic to increases in capability and 80% is incidental complexity and waste.

      • weinzierl 418 days ago
        For me also the Commodore came to mind. It had 64K of RAM and a 64K address range; because other things had to fit in there, not all RAM was usable at the same time. The clock frequency of the PAL model was 985 kHz (yes, KILO), so not even a full MHz.

        Yet, I could do

        * word processing

        * desktop publishing

        * working with scanned documents

        * spreadsheets

        * graphics

        * digital painting

        * music production

        * gaming (even chess)

        * programming (besides BASIC and ASM I had a Pascal compiler)

        * CAD and 3D design (Giga CAD [1], fascinated me to no end)

        * Video creation [2]

        For all these tasks there were standalone applications [3] with their own GUI [4]. GEOS was an integrated GUI environment with its own applications and way ahead of its time [5].

        It still blows my mind how all this could work.

        My first Linux ran on a 386DX with 4M of RAM, which is probably as low as one can get. Even the installer choked on that little RAM, and one had to create a swap partition and swapon manually after booting but before the installer ran. In text mode it was pretty usable though; X11 worked, and I remember having GNU Chess running, but it was quite slow.

        [1] https://youtu.be/ZEf9XMrc5u8

        [2] OK, this one is a bit of a stretch but there actually was Videofox for creating video titles and shopping window animations: https://www.pagetable.com/docs/scanntronik_manuals/videofox....

        [3] Some came on extension modules which saved RAM or brought a bit of extra RAM, but we are still talking kilobytes. For examples see https://www.pagetable.com/?p=1730

        [4] Or sort of TUI if you like; the strict separation of text and graphics mode wasn't a thing in the home computer era.

        [5] The standalone apps were still better. So, as advanced as GEOS was, I believe it was not used much productively.

        • nine_k 418 days ago
          Sure, Commodore could do all that!

          But if you had to use that software now, you'd say (justly) that it's extremely basic and limited, and that interoperability with other systems is not great.

          What can be done ≠ what's comfortable to use.

          • weinzierl 416 days ago
            Fully agreed. When I tried my old Commodore a while ago, I couldn't stand the 50Hz screen flicker for long. Unbelievable that back in the day I spent hours upon hours in front of that stroboscope.

            For me it's more about the excitement that the bright future so clearly lay ahead of us, mixed with a slight disappointment that I sometimes feel we could have made more out of it.

          • api 418 days ago
            That doesn’t account for the sheer monstrosity of modern software. You don’t need gigabytes of RAM for those things.
            • brewdad 418 days ago
              The state of things would seem to suggest otherwise.
      • alpaca128 418 days ago
        To me the most impressive recent example is a video editor developed for Haiku OS [0]. It fits on a 1.44MB floppy disk.

        [0] https://github.com/smallstepforman/Medo

    • jodrellblank 418 days ago
      Eight Gigs And Constantly Swapping.

      Zawinski’s Law - every program on windows attempts to expand until it can be your default PDF viewer. [cloud file sync, advertising display board, telemetry hoover, App Store…]

    • 1vuio0pswjnm7 418 days ago
      2GB is a ridiculous amount of memory for something like an OS.

      When we see egregious examples like Windows, then it's arguable having constraints might be desirable. It is well-known that "limitation breeds creativity". It's certainly true outside of "tech" companies. I have witnessed it first hand. "Tech" companies are some sort of weird fantasy world where stupidity disguised as cleverness is allowed to run rampant. No more likely place for this to happen than at companies that have too much money.

      • 1vuio0pswjnm7 416 days ago
        Many of them do not need to turn a profit, and a small number have insane profits due to a lack of meaningful competition (cf. honest work). With respect to the latter, it's routine to see (overpaid) employees of these companies brag on HN about how little work they do.
    • zozbot234 418 days ago
      The standards were also a lot lower back then. Modern-day users expect high resolution and color depth for their screens, seamless hardware support no matter what they plug into the machine, i18n with incredibly complex text rendering rather than a fixed size 80x25 text mode with 256 selectable characters, etc. These things take some overhead. We can improve on existing systems (there's no real reason for web browsers to be as complex as they are, a lot of it is pure bells and whistles) but some of that complexity will be there.
      • darknavi 418 days ago
        And they also expect their programs from 20 years ago still run.
    • norman784 417 days ago
      You can achieve a good memory footprint with Linux. Just 2 or 3 years ago I was daily-driving Arch Linux with bspwm as a window manager; it used only 300 MB, which for me is pretty darn good. But as soon as I opened VS Code with a JS project, my RAM usage was at 12GB. We have a lot of bloatware everywhere; that's pretty sad.

      edit: This reminds me of some rants from Casey Muratori about VS[0] and Windows Terminal[1]

      [0] https://youtu.be/GC-0tCy4P1U

      [1] https://youtu.be/hxM8QmyZXtg

    • Aerroon 418 days ago
      I remember taking Windows XP to below 100 MB of RAM usage.
      • bArray 418 days ago
        I remember needing to get Windows XP under 64MB of RAM so that I could run some photo-editing software. XP was relatively feature-complete; I don't think Windows currently ships with 32x the features of XP (64MB vs 2048MB minimum).

        Linux with a lightweight GUI for example can still run okay with just 128MB. I ran Debian with LXDE on an old IBM T22, and it worked perfectly well. Running Firefox was a problem (but did eventually work), but something more stripped down like NetSurf or Dillo is blazingly fast.

      • karmakaze 418 days ago
        Then that's where it started to double. I was running Visual C++ on Windows NT 3.51 with 32MB.
    • phendrenad2 418 days ago
      Imagine the first person to realize that you can write bloated software faster than highly-optimized software. Competitive advantage over his peers.
    • ShaneMcGowan 418 days ago
      We don’t need to worry about memory efficiency until we stop getting gains via hardware improvements. For now developers can just slap a web app into some chromium based wrapper, make sure their code doesn’t have any n^2 in it and you’re good to go.
      • Wowfunhappy 418 days ago
        Tell that to the person on a fixed income who has to invest in an expensive new machine because their 2015 laptop (which still has a whopping 4 GB of memory and a CPU that would have been top-of-the-line twenty years ago) has become unusably slow.

        Software efficiency is a serious equity and environmental issue, and I wish more people would see it that way.

        • NovaPenguin 418 days ago
          This is why I argue that one of the best things the Free/Libre software developer community can start doing is optimizing for lower-spec machines. Microsoft and Apple are either too closely knit with, or directly provide, the hardware to be interested in prolonging the lifetime of hardware they sell. Optimizing open OSes can prolong the lifetime of hardware by a significant margin, and it means that lower-income folks are not left in the dark. I don't just mean in well-off countries; if you are in the lower classes of the global south, there is no other option.

          There was (is? - not sure) a version of Firefox for PowerPC Mac OS X - TenFourFox - that brought modern Firefox features/support to Macs long past their prime. They mentioned that their favorite story during development was: "One of my favourite reports was from a missionary in Myanmar using a beat-up G4 mini over a dialup modem; I hope he is safe during the present unrest."

          http://tenfourfox.blogspot.com/2020/04/the-end-of-tenfourfox...

          This is what can happen when things are optimized for the people, not the business. This is part of why I still use a Core 2 Duo as my daily runner, if it ain't broke don't fix it.

          • RulerOf 418 days ago
            >This is why I argue that one of the best things that the Free/Libre software developer community can start doing is optimizing for lower spec machines.

            But isn't the primary application for these machines going to be the web browser, which is pulling in so much JS insanity that the web sites won't render well anyway?

            • NovaPenguin 416 days ago
              Yeah, that is unfortunately a big part of it. Via NoScript I generally run the web very lean but there is only so much one can do.
        • DrewRWx 418 days ago
          It is depressing that I had this same argument in college a decade ago and people are still so cavalier about not optimizing their code.
          • Gigachad 418 days ago
            Because the person on a 2015 budget laptop isn’t the one paying their wage to optimise apps.

            Companies will invest in what pays the bills. And hyper optimising for customers with no money isn’t it.

          • josefx 418 days ago
            To be fair, if you forced programmers to write efficient code you would just make everything more expensive and flood the market for unskilled labor with university graduates that can't find their own ass.
            • Wowfunhappy 418 days ago
              If it really did come down to that, I would still rather people had to pay more for software and less for hardware, because software has a comparatively minuscule environmental impact.
            • MrYellowP 418 days ago
              Actually no. If programmers actually learned how to properly program the machines, we'd not be in the mess we are in right now. Abstraction is the cancer that got us to where we are.

              Nobody has any actual clue what they're doing, everyone keeps writing code for the compiler hoping for the best and the rest of the world has to buy new machines because the programmers of the last decades sucked.

              That, btw, includes most of you people reading this. You're fucking welcome.

        • nine_k 418 days ago
          No need to invest in an expensive new machine; a device from 5 years ago, with some more RAM added, would already be pretty adequate. I'm typing this from a ThinkPad T470, which was introduced in 2017 and is my main workhorse machine.

          A top-of-the-line laptop CPU from 20 years ago likely just doesn't support addressing more than 4GB of RAM. Forcing it to work on modern resource-heavy web pages and media is like forcing a GPU from 20 years ago to run Skyrim. It's just not adequate.

          • Wowfunhappy 418 days ago
            20 years ago is pushing it a bit. But 12 years ago, in 2008, I used a computer with 4GB of RAM in order to:

            • Read the news

            • Post on social media

            • Make video calls

            • Use instant messaging

            • Create and edit word documents/presentations/spreadsheets

            Today I use my computer for all of those same things... and yet they all require drastically more memory (and CPU, GPU, etc). What happened, and how does this benefit consumers? Yeah, modern web pages are resource-heavy—but to what end†?

            In some cases, the requirements really did change. For example, I can now watch videos in 4K; my 2008 computer could handle 1080p, but I imagine it wouldn't have handled 4K as well. However, I suspect many users of old machines would be perfectly happy to drop down to a lower resolution.

            ---

            † Something I find amusing in all this... people often say they're glad Flash applets died because they were slow. Nowadays, instead of Flash, we use browser apps written in Javascript. I wonder how "slow" those apps would run if you threw them on a computer from the Flash era. (This isn't to discount other problems with Flash, although I do think it has a worse reputation than it deserves.)

            • vbezhenar 418 days ago
              You can use a computer with 4 GB of RAM today for all the things you've mentioned. It might swap here and there and not be as snappy, but generally it'll work.

              I think Apple only recently stopped selling 4 GB computers. And their phones from last year ship with 4 GB of RAM while being perfectly able to do all the things you've mentioned as well.

              • josephg 417 days ago
                Yeah, I agree - I don't think ram is usually the problem.

                I used to have a 2016 dual core macbook pro with integrated graphics and 8gb of RAM or something. The machine was great when I got it, but 18 months ago it was limping along and I finally decided to get rid of it.

                And it wasn't any 3rd party apps that killed the machine. Every time the machine started up, iphotoanalysisd or some random spotlight service or something would be eating all my CPU. It was always a 1st party Apple app which was making it slow. And the graphics felt laggy. Just moving windows around felt bad a lot of the time, even when I didn't have anything open. Xcode would sometimes lag the machine so much that it would drop keystrokes while I was typing. I had RAM to spare - it was a CPU problem.

                In the process of wiping the machine, I booted into Recovery mode and it booted the 2016 recovery image of macos. Holy smokes - the graphics were all wicked fast again! I spent a couple minutes just moving windows around the screen in recovery mode marvelling at how fast it felt.

                I wonder if reverting to an old version of macos would have fixed my problems. As far as I can tell, this was all Apple's fault. They piled up macos with so much crap that their own computers couldn't cope with the weight. I also wonder if they broke the intel graphics drivers in some point release somewhere along the way, or they started relying on GPU features that Intel's driver only had software emulation for.

                Modern macos still has all that crap - the efficiency cores in my M1 laptop are constantly spinning up for some ridiculous Apple service or something. But at least now that still leaves me with 8 P-cores for my actual work. It's ridiculous.

                I bet linux would have worked great on that old laptop. I wish I tried it before turfing the machine.

              • Wowfunhappy 417 days ago
                Sure, it's possible to get by on 4 GB of RAM today, but it used to work a lot better!

                Compare the memory usage of:

                • 2008-era Skype and iChat vs Slack, Teams, and Discord.

                • 2008-era web pages (including with Flash embeds) vs modern web pages.

                • Microsoft Office 2007 vs current Microsoft 365.

                And it's not only or even primarily memory, but also CPU requirements and so on.

        • wizofaus 418 days ago
          While I do agree with this, it seems worse than that - I've observed that a number of systems that ran well 5 or so years ago simply don't any more, even with exactly the same OS and essentially the same software. I don't know to what degree that's actual hardware deterioration (or at least file system fragmentation), vs additional gumpf getting automatically installed and slowing things down (though every time I've tried to remove such gumpf, it hasn't really helped), or even user perception (but I don't buy that this explains apps that now take over 30 seconds to start up when they used to take 5 at most). I have one 8+ year old Windows 7 machine in particular that I use for music streaming, and it basically can't be used for at least 30 seconds after logging in - but then seems mostly fine after that.
          • Wowfunhappy 417 days ago
            "Windows Rot" is definitely a thing but it can be cleared out by doing a clean reinstall of the OS. While this can be time consuming, you'd likely be doing it anyway if you got a new machine.
            • wizofaus 417 days ago
              No idea where I'd even find an installer for Windows 7! It does make me wonder whether upgrading it would actually help. But for now it works well enough I'd rather not risk it (the other thing I use it for is some old software that requires a FAT partition for its licensing to work!).
              • Wowfunhappy 417 days ago
                I have a retail copy of Windows 7 on a DVD! But yeah, if you didn't buy it back in the day I'm not sure where you'd get it now.

                Windows 10 (and I assume 11) has an option to "refresh" Windows in Settings.

        • arbitrage 418 days ago
          Expecting your 8+ year old laptop to run as well as it did when it was new is completely unreasonable.

          That has never been a reasonable expectation in the history of computing.

          • Wowfunhappy 418 days ago
            Why? Are the types of things I want that laptop to do different today than they were 8 years ago? Sure, apps and websites are heavier, but I'd posit the things most people do on their computers haven't changed in a decade at least.

            > That has never been a reasonable expectation in the history of computing.

            Yes, but again, why? As I see it, everyone has been conditioned to this lie that computers naturally slow down over time, because that's the way it has always been relative to the speed of current software. Originally, that was for a good reason—I'm glad programs now use full-color GUIs. But now?

            What would actually happen if Moore's law ended tomorrow, and we were no longer able to make computers faster than they are today? I suspect that a (slim) majority of computer users would actually benefit. Not hardcore gamers, not scientists, and certainly not software developers--some people really do need as much performance as they can get. But for the people who just need to message friends, write documents, check email, etc., the experience would be unchanged—except that their current computers would never slow down!

            • josephg 417 days ago
              I absolutely agree. It seems like most software developers only start optimizing code once our software starts feeling slow on our top-of-the-line development machines. As a result, every time we get faster computers we write slower code. When the M1 macs and the new generation of AMD (and now intel) chips came out 18 months or so ago, I spent big. I figured I had about 2 years of everything feeling fast before everyone else upgraded, and all the software I use slowed down again.

              Years ago while I was at a startup, I accidentally left my laptop at work on a Friday. I wanted to write some code over the weekend. Well, I had a raspberry pi kicking around, so I fired up nodejs on that and took our project for a spin. But the program took ages to start up. I hadn't noticed the ~200ms startup time on my "real" computer, but on a r.pi that translated to over 1 second of startup time! So annoying! I ended up spending a whole morning profiling and debugging to figure out why it was so slow. Turns out we were pulling in some huge libraries and only using a fraction of the code inside. Trimming that down made the startup time ~5x faster. When I got into the office on monday, I pulled in my changes and felt the speed immediately. But I never would have fixed that if I hadn't spent that weekend developing on the raspberry pi.
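
              The same failure mode exists in Python, where `python -X importtime` prints a per-module startup-cost tree that makes oversized dependencies obvious. A different mitigation than the trimming described above (a hedged sketch, not anyone's actual fix here) is deferring a heavy import until first use:

```python
import json
import sys

def export_csv(rows):
    # Deferred import: runs of the script that never export CSV never
    # pay this module's load cost. (csv itself is cheap; it stands in
    # for a genuinely slow dependency here.)
    import csv
    csv.writer(sys.stdout).writerows(rows)

# The fast path touches only json, so startup stays quick.
print(json.dumps({"rows": 0}))
```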

              Since then I've been wondering if there's a way to do this systematically. Have "slow CPU tuesdays" or something, where everyone in the office turns off most of their CPU cores out of solidarity with our users. But I'm not holding my breath.
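
              On Linux a process can even do this to itself, no BIOS fiddling required. A minimal sketch using `os.sched_setaffinity` (Linux-only; pinning to core 0 is an arbitrary choice):

```python
import os

# Confine this process (and any children it spawns) to CPU core 0,
# roughly simulating a low-end single-core machine for dogfooding.
os.sched_setaffinity(0, {0})

# The process now sees exactly one usable core.
print(len(os.sched_getaffinity(0)))  # -> 1
```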

          • ndriscoll 418 days ago
            I've never expected my computer to run worse over time. There's no real mechanism for that to even happen; it works fine until it fails completely.

            Recently it's become less possible to run the same software for 10+ years because so many things are subscription only and have unnecessary networking, which makes it necessary to patch security flaws, and then you have to accept whatever downgrade the vendor forces on you.

            Older applications that you used to be able to just install run just as well as they did the day they came out on the hardware available at the time. The idea that computers "get worse" is entirely a phenomenon of the industry being full of incompetence. Even (or perhaps especially) programmers at FAANG companies are just not very good at their jobs.

            Check out the argument Casey Muratori got into with the Microsoft terminal maintainers about how slow the thing was. He got the standard claims about how "oh it's so complex and Unicode is difficult and he's underestimating how hard it is", so he wrote a renderer in a few hours that was orders of magnitude faster, used way less memory, and had better Unicode support.

            • Dalewyn 418 days ago
              There is (or at least was) some truth in computers getting worse over time.

              File system fragmentation was a very significant problem when most people still used HDDs as their primary mass storage media. SSDs are far less affected by fragmentation because of much faster random access times, but HDDs and thus performance suffered.

              The Windows Registry is an arcane secret not even Microsoft fully comprehends at this point, and it can get very messy if a user installs and uninstalls lots of programs frequently. This is, of course, a problem with uninstallers not uninstalling cleanly and not a problem with Windows or the users. With so much crap moving to Chrome online-software-as-a-service outfits, users aren't (un)installing as many programs as frequently anymore, but an unkempt Windows installation can definitely slow down over time.

              Software in general also just gets more and more bloated as the moons pass. More bloated software means less efficient use of hardware, meaning less performance and more user grief over time.

      • retrac 418 days ago
        I have a netbook from around 2010. It has 2 GB of RAM and a single core Atom processor. It boots to a full Linux GUI desktop in a minute or so. It can handle CAD software, my editor, and my usual toolchain, if a bit slowly. It even handles HD video and the battery still holds a 6 hr charge.

        But it doesn't really have enough RAM to run a modern web browser. A few tabs and we are swapping. That's unusably slow. A processor that's 5 or 20x slower is tolerable often. Working set not fitting in RAM is thrashing with a 1000x slowdown. And so this otherwise perfectly useful computer is garbage. Not enough RAM ends a machine's useful life before anything else does these days, in my experience.

        • anthk 418 days ago
          Enable ZRAM. I run luakit with just 1GB of RAM and a compressed GB of ZRAM.

          Atom n270 netbook, go figure.
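
          For anyone wanting to try this, one way to enable ZRAM at boot, assuming systemd plus the zram-generator package (the size and algorithm here are illustrative choices, not the commenter's setup):

```ini
# /etc/systemd/zram-generator.conf
[zram0]
# compressed swap device sized at half of physical RAM
zram-size = ram / 2
compression-algorithm = zstd
```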

          Also, run this to get a system wide adblocker:

              git clone https://github.com/stevenblack/hosts
              cd hosts

              # keep everything from the Steven Black section onward
              sed -n '/Start Steven/,$p' < hosts > hosts.append

              # "sudo cat hosts.append >> /etc/hosts" would fail: the >>
              # redirect runs as the unprivileged user, so append via tee
              sudo tee -a /etc/hosts < hosts.append > /dev/null
          
          EDIT: wrong URL
        • teo_zero 417 days ago
          Same here, except I fine-tuned the kernel to boot in under 10 s.

          Of course it can't run all today's bloated software, but we're talking about the operating system, here, not the applications.

      • est31 418 days ago
        That's fine for those desktop users who don't care about spinning fans, but many users are on laptops and care about battery life. An inefficiently coded app can keep the CPU at high clock levels even when nothing about the app requires it, because it's just a chat app or such.
      • colinsane 417 days ago
        > For now developers can just slap a web app into some chromium based wrapper […]

        making 10% of users unreachable in order to more easily reach the other 90%. yeah, it’s a fine business strategy. though i do wish devs would be more amenable to the 10% of users who end up doing “weird” things with their app as a result. a stupid number of chat companies will release some Electron app that’s literally unusable on old hardware, and then freak out when people write 3rd party clients for it because it’s the only real option you left them.

      • cmrdporcupine 418 days ago
        Prices per GB of RAM are starting to plateau, or at least not fall as quickly:

        https://aiimpacts.org/trends-in-dram-price-per-gigabyte/

        DRAM density and cost isn't improving like it used to.

        Also memory efficiency is about more than just total DRAM usage; bus speeds haven't kept pace with CPU speeds for a long time now. The more of the program we keep close to the CPU -- in cache -- the happier we are.

      • MrYellowP 418 days ago
        You're part of the reason why we're stuck in the mess most people, actual idiots, don't even acknowledge as a mess.
        • ShaneMcGowan 418 days ago
          True, but the products I build in this inefficient way solve other messes, so you win some, you lose some.
    • kristianp 418 days ago
      I was recently boggling at how a small running Python script could take up 30+ MB of RAM. The data it processed was probably < 100 KB in total.
      • Gigachad 418 days ago
        You are getting a whole runtime and standard library bundled in. The whole point of Python is quick and dirty scripts: saving you 4 hours is worth more than using 20 MB less RAM for something that gets run a couple of times.
    • xvilka 417 days ago
      Using Web technologies for places where they don't belong killed the performance.
    • muyuu 417 days ago
      early expectations on code interfacing and re-usability failed catastrophically

      in my previous job rather than give people root access to their laptops we had to do things like running a docker image that ran 7zip and we piped the I/O to/from it, and I'm not kidding we all did this and it was only bearable thanks to bash aliases and the fact that we had 16GBs of RAM

    • at_a_remove 418 days ago
      Indeed. When I de-bloated Windows NT workstation, I could reliably get it down to fourteen megabytes of RAM.
    • aaron695 418 days ago
      [dead]
  • Someone1234 418 days ago
    This removes WinSxS. That's fine for embedded, since you'd just package the DLLs you need with any executables you want to run, but trying to run this as a general purpose OS is a fool's errand. Calling WinSxS "bloat" when that "bloat" is allowing 30+ years of backwards compatibility (and a lot of stuff will break) is creative by the article's author for sure.

    Nothing wrong with Tiny11 though, if you know what it is good at and use it for that. Namely, "offline" Windows for some appliance-like usage (e.g. factory controls, display screens, et al) when Linux won't do for whatever reason and licensing Windows IoT isn't possible (small business/personal project/etc).

    • binarycrusader 418 days ago
      The idea that removing WinSxS saves space is generally misguided anyway. The vast majority of content there is actually the original file that was used to create a hard link at the destination. So obviously removing the file doesn’t really save any appreciable amount of space.

      The remaining content unique to WinSXS is either for cryptographic validation, app compat, or the driver stack.

      • asveikau 418 days ago
        Came here to say this.

        WinSXS looks like a huge folder in explorer, because explorer's size estimates do not tell you about hard links. It's not that big. I need to question somebody who thinks removing it will remove a lot of bloat.

      • forgotpwd16 417 days ago
        WinSxS also includes backups allowing you to revert updates and disabled features. Cleaning up using dism has always recovered a few GBs for me.
        • binarycrusader 417 days ago
          The amount of backups in there is fairly minimal and is limited to a folder called “backups”. As for disabled features, the size of disabled features there is also relatively small, not gigabytes.

          Any space reclaimed using dism’s startcomponentcleanup is only from removal of superseded updates which normally happens automatically whenever the maintenance task runs after a certain period of time.

          Note that I explicitly consider backups separate from superseded updates. Superseded updates are kept for a period of time to allow the user to uninstall a newer update.

      • kristianp 418 days ago
        Hmm, so how can you see the real size of winsxs?
        • forgotpwd16 417 days ago

            dism.exe /Online /Cleanup-Image /AnalyzeComponentStore
          
          Gives both apparent (seen in Explorer) and actual size.
        • binarycrusader 418 days ago
          Brute force?

          Get the size of every “file” in the volume along with the file id of each and then subtract the size of any files with a matching file id that are in the WinSxS. Then sum the size of the remaining files that were from the WinSxS.

          You can get the file id using https://learn.microsoft.com/en-us/windows/win32/api/winbase/... or fsutil, etc.

          You could also probably execute “fsutil hardlink list” for every “file” in WinSxS and then ignore any that list more than one result and sum the size of the remainder.

          There are of course more efficient ways to do this but those are some quick hacks.
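
          A cross-platform sketch of that second hack in Python rather than fsutil (on Windows, `os.stat` reports the NTFS file ID as `st_ino` and the hard-link count as `st_nlink`; treating every multiply-linked file as shared is a simplifying assumption, since a second link could itself live inside WinSxS):

```python
import os

def unique_bytes(root):
    """Approximate the disk space truly unique to `root`: count each
    file once by its file ID, and skip files that have hard links
    elsewhere (st_nlink > 1), treating those as shared."""
    seen = set()
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            st = os.stat(os.path.join(dirpath, name))
            key = (st.st_dev, st.st_ino)
            if key in seen:
                continue  # another link to a file already counted
            seen.add(key)
            if st.st_nlink == 1:
                total += st.st_size
    return total
```

          Pointed at `C:\Windows\WinSxS` from an elevated prompt, this should come out far smaller than Explorer's naive sum.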

    • Semaphor 418 days ago
      In their defense, they also don’t recommend running it on anything that can actually run normal Win 11.
    • twelvedogs 417 days ago
      it does more than that: it keeps a copy of every dll you've ever installed, not just the ones in use. there's a reason it just gets bigger forever even if you don't install more programs
    • 8note 418 days ago
      how much backwards compatibility do i need nowadays?

      my laptop only needs to run a few things:

      browser, vscode, steam, the microsoft drawing app, some office stuff, sublime, discord

      which all update pretty regularly.

      the age of the desktop app has been replaced by the age of the browser and electron based apps. i can imagine businesses who built their own setups back in the age of the desktop app being stuck with it, but for the most part i don't think i use windows' backwards compatibility anymore

      • zamadatix 418 days ago
        They may update regularly but that doesn't mean they only use modern features. E.g. even just Steam itself (not just games in it) is largely still 32 bit on Windows requiring gigabytes of 32 bit compatibility files using interfaces going back decades even though Windows 11 itself doesn't have a 32 bit version anymore.
        • terrycody 418 days ago
          Oh I got your point, that's why this thing exist!
        • terrycody 418 days ago
          Sorry, but does it mean some games can't be played through Steam on Windows 11?
          • TingPing 418 days ago
            Windows 11 has no 32-bit release, but 64-bit Windows includes everything needed to run 32-bit software.
          • connicpu 418 days ago
            You can; you wouldn't be able to if you stripped your windows install of all the files that still allow 32-bit programs to run
      • Dalewyn 418 days ago
        A lot more than you think:

        * Steam (the root process, not the subsequent Chromium child processes) is 32-bit, as are a lot of games.

        * Discord is 32-bit.

        • pas 418 days ago
          How come people run these Electron apps separately when they run pretty well as tabs/windows in the browser?
          • dariusj18 418 days ago
            Because a long time ago Microsoft lost a lawsuit when they tried making the web and Windows more integrated
          • Ar-Curunir 418 days ago
            Because integration into the desktop is better as an Electron app. Eg sound and video calls, keyboard shortcuts, not having to worry about finding your Discord/Slack/whatever tab
            • Dalewyn 418 days ago
              I wouldn't say Electron is better desktop integration.

              Discord for example is literally just a chrome-less Chrome; the zoom in/out hotkeys in Chrome still work in it.

              This is also not mentioning how no Electron program ever visually adheres to the desktop environment it's running in.

              • smileybarry 417 days ago
                > Discord for example is literally just a chrome-less Chrome; the zoom in/out hotkeys in Chrome still work in it.

                No, it also includes:

                * Voice (and text chat) overlay for games (DLL)

                * Game integration via lobby & rich presence APIs

                * Krisp noise cancellation (requires a DLL as well)

                * (Better) screenshare (Chrome has an API for window sharing now but Discord's is a bit more robust with several backends in case one fails)

                * System-wide keybinds

                * Scripting support via gaming accessory apps (Logitech G HUB, HyperX, etc.)

                It also (anecdotally) works faster than the web client, in my experience.

                Just because the zoom controls work (which is an accessibility feature) doesn't mean it's a barebones Chrome wrapper.

              • oefrha 418 days ago
                Electron apps can use desktop capabilities. Web apps are at the mercy of the few desktop-bridging APIs that browsers inconsistently expose. They’re not talking about UI/UX “integration”.

                Discord for instance has this “currently playing game X” feature. I have zero interest in broadcasting what I’m doing at the moment to the world, but many do and have this feature enabled. Good luck implementing that in a browser-confined web app.

              • leonidasv 418 days ago
                Electron can call native OS APIs the browser can't.

                An example: https://stackoverflow.com/a/39569062

          • poopooracoocoo 416 days ago
            I think it's because the apps have additional functionality and because the services push users to use the apps on their websites. Some of the additional functionality is artificially limited to apps as companies can put more tracking, advertising, and can ensure that people won't leave their service easily by just closing a tab.
          • rewgs 418 days ago
            I guess I'm one of the few who doesn't. Discord, Slack, Spotify, etc -- they're all just bookmarks for me.
          • encryptluks2 418 days ago
            Because browsers haven't built enough compatibility with the desktop to use them like a regular app, severely limiting (sometimes intentionally) what you can and can't access on the file system. It is expected sometime in the near future that browsers will have enough sandbox protection to let app developers do the same things that only Electron allows, but without the excessive bloat you get from Electron.
      • Rebelgecko 418 days ago
        What's the use case for needing to run steam but not individual games? Are you using desktop sharing with an ec2 instance or something?
        • pas 418 days ago
          i'd be happy to hunt DLLs for games instead of installing every damned VCredist and directx package globally.

          basically a Proton/NixOS for Windows :)

        • Marsymars 418 days ago
          Not a Windows system, but I run Steam on my Mac purely for in-home streaming from my gaming system. No games installed.
      • MuffinFlavored 418 days ago
        Isn't it interesting that both you and I frequently use Sublime and VS Code? Why can't VS Code kill off Sublime? It's interesting to me how a text editor like Sublime can't quite be a preferable IDE, yet an IDE isn't a preferable text editor either.
        • speg 418 days ago
          Sublime is, well, Sublime.

          Every time I try VS Code I just can’t commit. There’s a bit too much going on and it never feels as tight as Sublime.

          I did just get a new Windows machine, so maybe I should try it on that.

          • nickpeterson 418 days ago
            I have the same feelings. Sublime has really good, “let me visually manipulate text through cursors” functionality. The find all, regex highlighting, and multi cursors are really nice. These features are in nearly every editor but they always feel crappy to use compared to sublime.

            I’m starting to get back into emacs recently though because I like fiddling with tools more than productivity.

          • MuffinFlavored 418 days ago
            When you are using Sublime, do you have the ability to right click somewhere and "go to definition" in a big code base?
        • tkuraku 418 days ago
          I've migrated to VS Code from Sublime. VS Code's integrated debugger, LSP support, etc. are way ahead of Sublime's. That said, typing in Sublime just feels good compared to VS Code. I keep Sublime around for quick edits.
      • bombolo 418 days ago
        > steam

        Do you want steam to actually run any game? :D

    • JoeAltmaier 418 days ago
      Um. Where else do you need it to run in 2GB but embedded? I think the article is informative, fair and correct.
      • djur 418 days ago
        "De-bloated" implies that the stuff removed is "bloat", i.e. worthless. I wouldn't assume that a "de-bloated" install was any less suitable for general purpose computing tasks.
        • pixl97 418 days ago
          I'd consider that a poor assumption. If you try to use this to install a wide range of applications you run into one of two issues, you rebloat the system, or some things fail to run.
          • Thorrez 417 days ago
            I think the core question is what is the definition of bloat.
      • demetrius 418 days ago
        > Where else do you need it to run in 2GB but embedded?

        I use an eeePC laptop with 2 GB of RAM as my home computer. It's quite usable with Linux.

        Not that I'm planning to install Win11 on it, but the assumption that 2GB is enough only for embedded devices is incorrect.

        • JoeAltmaier 417 days ago
          I stand corrected. Embedded is all over the map, but a brief search shows you have to get well above the $10 embedded SOC to get RAM like that.
      • NikolaNovak 417 days ago
        I have a media PC running Windows 10 with 2GB of RAM. It has run great with Media Player Classic, Netflix, and even Steam installed. I certainly would not assume "debloated" means "completely crippled".

        I think that's the point - some people assume 2GB is meaningless, whereas others see it as a HUUUGE amount of memory. Never mind historically; just consider what a modern phone can do with 2GB of RAM.

    • causality0 418 days ago
      > Calling WinSxS "bloat" when that "bloat" is allowing 30+ years of backwards compatibility (and a lot of stuff will break) is creative by the article's author for sure.

      Taking up a lot of space on your drives for data to maintain backwards compatibility makes sense. Why, when not being actively used, does it need to occupy gigabytes of RAM?

      • Someone1234 418 days ago
        > Why, when not being actively used, does it need to occupy gigabytes of RAM?

        There's no need, which is why it doesn't.

        • ninkendo 418 days ago
          Given that the premise of this discussion is how Tiny11 credits removing WinSXS as part of the reason they were able to free up the memory, it would appear the article (and OP) disagree.
          • Dooflegna 418 days ago
            The article talks about memory savings and storage savings. Removing WinSXS is a storage savings play, not a memory savings play.

            Here's the relevant quote:

            > Moreover, removing the Windows Component Store (WinSxS), which is responsible for a fair degree of Tiny11’s compactness, means that installing new features or languages isn’t possible.

      • ChuckNorris89 418 days ago
        Caching frequently used apps for snappiness.
  • worble 418 days ago
    It really staggers me the lengths some people will go to try and preserve something that is actively against them, when there are alternatives right there.

    I'm not saying Linux is for everyone, but the kind of people creating and running these scripts really should have no issue daily driving Ubuntu or even Arch. Or if they desperately need photoshop or whatever, get a mac.

    It's like watching people constantly go back to an abusive relationship.

    • userbinator 418 days ago
      > It really staggers me the lengths some people will go to try and preserve something that is actively against them, when there are alternatives right there.

      The same can be said for those working on jailbreaks and the M1 Linux project, as well as all of the cracking/hacking scene. For some people, it's far more interesting and enjoyable to fight --- and possibly win --- than just "abandoning ship".

    • qwezxcrty 417 days ago
      Well, maybe I'm a minority for having a EE/physical science hobby, but also belong to the kind of people you are referring to.

      I'm pretty stuck to Windows as I need it to drive my home lab. I need to run Windows to

      1. Get data from an old optical spectrometer. It was designed for optical endpointing of plasma etching, and one will have a hard time finding anything that isn't running Windows in a fab (except lithography).

      2. Run a 28 years old piece of software to acquire timestamps from a HP 53310A modulation domain analyzer

      3. Grab frames from an old xray detector

      4. Work with two NI DAQ cards. Yes, they are supposed to work on Linux, but I always get weird errors on my Ubuntu work computer while they never failed me on my Windows laptop.

      5. Use Autodesk Inventor to prepare files for the 3D printer/machine shop. Siemens NX used to work on Linux, but apart from that, there is not a single piece of non-toy 3D CAD software I'm aware of that supports Mac or Linux.

      6. LTspice simulations and Altium Designer layout work.

      Windows is the only first class citizen in many areas, software development and artistic work are two exceptions.

      And so far, it seems I can still always be one step ahead of MS in the anti-consumer war, so I'm not too worried.

    • martin_a 418 days ago
      > Or if they desperately need photoshop or whatever, get a mac.

      I'm kind of in that situation, and I don't think going with a Mac and the Apple ecosystem is really better than trying to use Windows 10 as long as possible on an older Thinkpad.

      • dbtc 418 days ago
        What about the gimp?
        • martin_a 417 days ago
          Didn't really get the hang of it, to put it that way.

          Everybody who's using tools like Photoshop professionally has been "shaped" to feel well in the Adobe ecosystem. I doubt that's good but that's how it is.

          Photoshop, Illustrator, InDesign: they all feel and work similarly, which helps with transitioning/switching between these tools without big issues.

          Now take Gimp, Inkscape and Scribus against that. Everything looks different and probably works differently, too. I need to get work done, not learn three separate programs. Also, Scribus seems to be dead; the latest dev blog entry is from 2016.

          Serif is doing great work with Affinity, but Adobe is still going strong and defines the professional industry. As long as that's the case we're stuck with Windows/Macs for professional work.

        • avx56 418 days ago
          *The GIMP™
    • rejectfinite 417 days ago
      This is not for regular Windows 11 usage... This removes WinSxS, so it cannot be upgraded, apps won't work, etc.
    • nateb2022 418 days ago
      Agreed. My initial response to any post beginning something like "On Windows XP..." "On Windows 7..." "On Debian..." would be like: "Well you already have Windows XP/7/Debian/whatever. If you want to use that, use that. Nobody is forcing you to use Windows 11."

      For the people who do want to use Windows 11, and who see it for what it is, it's pretty great. For the people who use Windows XP/7 or who stick to some minimalistic un-featured XFCE-running underpowered Linux machine, you do your own thing. No need to force that on everyone else.

      • out-of-ideas 418 days ago
        debloating does not mean "making it like XP/W7" - it means ripping out the horsecrap and components that are both unnecessary and a waste of space, and being able to control what goes on your system - sort of like what nix allows us to do; it also means having options to turn things on and off, etc.

        for the non tech savvy - windows is still a great choice for those wanting to simply game and not learn something new like linux - these are the same folks who do notice a difference when the OS is bloated and shows ads, and who ask for help knowing others know more; a lot of us do not have the time nor energy to fully support a vast array of friends' systems. these debloated windows builds are great for those folks, and for me, not having to /shrug and have people buy more hdd space for nothing.

        was it not linus himself who mentioned that linux as a popular desktop os will not be a thing until manufacturers who provide prebuilt OSes (and support them) ship them with linux? but in all honesty i feel that the X vs Wayland situation needs to be a bit more solidified, similarly with alsa/pulse/pipewire lol; but those are different issues

        • fuzzfactor 418 days ago
          For the enthusiasts who are doing the debloating, it's almost like they are gaming the system as they move from one level to another.

          Twenty years ago I had already been installing Windows XP to FAT32 volumes directly to be more compatible with W9x multibooting. I didn't know anybody else doing this (some thought it couldn't be done), but every time I installed XP I could see the names of every driver as it loaded during creation of the pre-installation environment. The very last two drivers are FAT32.SYS followed by NTFS.SYS. I figured Windows might have first been made functional on FAT32 but launched with the intention of total migration to NTFS for most people, as seen.

          In my later experimentation I found that Vista would run from a FAT32 partition but default Windows 7 would not do it very easily, simply because the WinSxS folder (pronounced win-sucks) was oversized in an insidious way.

          The W7 WinSxS folder size was bigger than Vista's but it did not approach the maximum size that FAT32 can handle.

          Instead it was the unnecessarily, stupidly long filenames which overran the long-filename handling ability of FAT32 early, once there were enough of them. Something the best engineers of the time would never have even considered doing, much less put into production.

          By judiciously deleting the majority of the contents of WinSxS (but not all by any means), W7 can be run from FAT32 as well without any functional shortcomings as far as my office was concerned.

          The modern approach to testing this for yourself would be to install the default W7 to a regular NTFS volume, then debloat the WinSxS folder manually, perhaps in safe mode or when booted to an alternative OS so none of the files on the W7 volume are in use at the time.

          Reboot to something like the W11 USB setup media, "Troubleshoot" to go to the command prompt (instead of installing W11), then capture (back up) the debloated dormant W7 partition manually using DISM.EXE.

          Then later, on a freshly formatted FAT32 drive, apply the captured W7 system, again using DISM.

          Create new boot files for the newly applied W7 system using BCDBOOT.EXE.

          Boot W7 while it's on FAT32 and prosper.
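
          The capture/apply/boot-file sequence described above can be sketched roughly like this (a hedged sketch: the drive letters, image path, and image name are illustrative assumptions, not taken from the original):

          ```bat
          :: Run from the setup media's recovery command prompt.
          :: C: = debloated W7 volume, D: = storage for the image, S: = boot partition.

          :: Capture (back up) the dormant, debloated W7 partition:
          dism /Capture-Image /ImageFile:D:\w7-debloated.wim /CaptureDir:C:\ /Name:"W7 debloated"

          :: Later, apply the captured system onto the freshly formatted FAT32 volume:
          dism /Apply-Image /ImageFile:D:\w7-debloated.wim /Index:1 /ApplyDir:C:\

          :: Create new boot files for the newly applied system:
          bcdboot C:\Windows /s S:
          ```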

          Works not that much faster than on an NTFS volume, but if you can reboot to Windows 9x on a multiboot system, you can search the FAT32 W7 volume blazingly faster than when the identical W7 system searches itself while on NTFS.

          Now of course all of this needs to be done in legacy BIOS mode since UEFI alone is not adequate for such continued full PC performance.

          I guess I could have been playing video games instead but reaching this level seemed just as rewarding anyway.

          Wonder if W11 would do this.

          Edit: For extra credit I already put W11 onto old BIOS PC's without any GPT, with regular MBR like it was W10.

          Bypassing hardware restrictions into smaller-than-recommended NTFS volumes using DISM.

          • out-of-ideas 417 days ago
            and i would wager that w11 system runs pretty fast as well

            i do miss the xp's cleanup hotfix cache button though

            I hope you have found the MSMG toolkit for your testing

      • dijit 417 days ago
        > Nobody is forcing you to use Windows 11.

        This is not strictly true, there are tonnes of reasons to use Windows; for example, I run thousands of gameservers.

        If I can shave memory usage of the OS that translates to a lot of cost savings.

        Windows XP/Vista/7 (and soon 10) being EOL does force me to upgrade.

  • mordae 418 days ago
    I've had access to cheap Windows licenses for years, which is why I kept it around as a secondary OS on my desktop to get around the hassle of getting games to run on Linux. Games are mostly play/finish/forget for me anyway.

    But for a few years now, most games I was interested in have run perfectly fine on Linux. I haven't rebooted into Windows for almost a year. So I think that, instead of upgrading to 11, I will eventually delete it, use the second SSD to hold my games on Linux, and won't look back.

    I remember the days when I was building bare metal recovery for some of our Windows systems using WinPE, imagex and Python. There was this feeling of sane people pouring into M$ to modernize the OS a little bit, and cool stuff came out. But in the end, it's still the same inscrutable mess it always was. Nowadays with more and more ads and unnecessary fluff that gets in the way.

    • neogodless 418 days ago
      I'm not quite there but... my laptop is mostly just for gaming, with some email, chat and web browsing on the side. So I thought I'd allow Windows 10 to upgrade to 11 and see how it is. (It's not getting anywhere near my desktop!)

      But... Windows 11 is just... annoying. The UI is worse than 10 in all the ways that matter to me. So I finally put Linux Mint on this laptop, and it's been pretty good. Not flawless, but really good. By default, I install and play games on Steam.

      Notable exception is Anno 1800, which has a clunky multiplayer setup anyway, and just doesn't connect under Linux, but works (begrudgingly) under Windows.

      Northgard has been awesome, but just tonight I had a bunch of server connection issues - can't 100% blame Linux, though 15 minutes into a multiplayer game, I was dropped while the two Windows players kept playing. But it's not conclusive!

      At any rate, I think for many PC gamers, Linux gaming would work, though it's still not 100% "install, join, play" for every game.

  • haunter 418 days ago
    Not 11 but the "Windows 10 IoT Enterprise LTSC 2021" is significantly better if you want an (official) full fledged Windows OS without the bloat. I'm using that on the Steam Deck w/ dual booting and it's perfect
    • wkat4242 418 days ago
      LTSC is no longer debloated. The latest version comes with the same crapware, Windows app store, "recommended" Microsoft account, telemetry that can't be turned off etc.

      It's just a stable version frozen in time but heralding it as the bloat-free alternative is no longer true.

      I have access to it through work and I gave it a spin recently but it's no longer what it used to be.

      • haunter 418 days ago
        >LTSC is no longer debloated. The latest version

        Which one? Or I guess it's not a public release?

        I installed IoT 2021 (Windows 10 IoT Enterprise LTSC 2021, version 21H2) like 3 weeks ago and there was nothing: no App Store, and less telemetry (there is some, but significantly less than on the "normal" Windows versions). Though I did reinstall the App Store, so maybe what I see is because of that.

        *Edited the wording

        It looks like this as a fresh install https://ia904606.us.archive.org/11/items/en-us_windows_10_io...

        • wkat4242 417 days ago
          Ok I have to check which version it was. I don't think it was IoT. I assumed the IoT one didn't have a GUI.

          I will check which ISO it was and reply back here because I'm very curious now too.

        • at_a_remove 418 days ago
          This seems magically good.

          Now, of course, the struggle is to see how I can get the ISO and get it activated. I'm woefully out of the loop.

          • haunter 418 days ago
            As pretty much with a lot of things nowadays Archive.org is your friend https://archive.org/details/en-us_windows_10_iot_enterprise_...
            • at_a_remove 418 days ago
              Thank you! I'm also a little shocked, I had no idea Archive.org was doing this. I'm starting to get the impression that there's quite a lot going on under the hood at archive.org, just reams of stuff I've never noticed ...
              • haunter 418 days ago
                >I had no idea Archive.org was doing this

                Well it's the users that are uploading but yeah Archive.org has an insane amount of stuff

                • kristianp 418 days ago
                  Until someone at Microsoft notices and puts in a DMCA.
    • qwezxcrty 418 days ago
      How do you activate that? I think KMS will not work for IoT versions.

      I use KMS activated non-IoT LTSC 2021 on my obsolete Surface. MS will not sell that edition to "consumers" like me, so I don't feel guilty at all for pirating it.

      • haunter 418 days ago
        MAS should work https://massgrave.dev/
      • npteljes 417 days ago
        Windows activation is a simple thing. You need a KMS emulator, and you need to point your Windows to that. If you don't want to set up your own emulator, you can just search the internet for emulators, and point your Windows to one of those. This is what most activators do anyways. But running your own is also easy, if compiling and running a software is easy for you. I personally use vlmcsd.
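
        As a hedged sketch, the "point your Windows to it" step uses the built-in slmgr script (the hostname below is a placeholder assumption; 1688 is the default KMS port):

        ```bat
        :: From an elevated command prompt; "kms.example.lan" is a placeholder
        :: for wherever your KMS emulator (e.g. vlmcsd) is listening.
        slmgr /skms kms.example.lan:1688
        slmgr /ato

        :: Inspect the resulting license state:
        slmgr /dlv
        ```
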
        • qwezxcrty 417 days ago
          I'm fully aware of the KMS stuff and how it works. But there are Windows editions that are not licensed for and cannot be activated with KMS. I don't think there is any public way of pirating them apart from finding a working MAK key. No one seems to have dug into SPPSvc and the internal workings of Product Policy. (If you find something interesting that I missed, please let me know)

          Unlike usual Enterprise editions, I don't think IoT Enterprise SKU will work with KMS. The only possible activation option seems to be with a PKEA key.

          For a machine that never sees the internet, the IoT version runs in a Deferred Activation state. So it is useful for an intranet machine that never sees the outside.

          • npteljes 416 days ago
            I just realized that you asked to activate a specifically non-kms version. Sorry, must have been tired.
    • DavideNL 414 days ago
      > "Not 11 but the "Windows 10 IoT Enterprise LTSC 2021" is significantly better"

      What about Windows 11, does something like "Windows 11 IoT Enterprise LTSC" exist also? Would that be equally good/debloated ?

      Thank you for the tip btw!

    • BirAdam 418 days ago
      Just learned about this. I wish I could get some ameliorated.info scripts for that build. I might be willing to try Winders again in that case. There are some older win32 applications that I miss.

      I just use Intel’s Clear Linux so… meh.

    • Wowfunhappy 418 days ago
      What is the difference between IoT and normal LTSC?
      • haunter 418 days ago
        Here is a handy reddit post when I was researching the same https://www.reddit.com/r/Windows10LTSC/comments/qw1qrs/ltsc_...

        The main differences are that IoT has longer lifecycle support, the activation method is different (but you don't even have to activate), and the IoT version is only available in English

        But it doesn't really matter; they are virtually the same

        • avinassh 418 days ago
          wait, can I use this as a daily driver on a laptop?
          • smileybarry 417 days ago
            You can try but it probably depends on the laptop and how messy your OEM's drivers are. (they might require/depend on some component carved out of the IoT editions)
          • haunter 418 days ago
            I use it to play games! It's totally perfect as a daily driver
  • giancarlostoro 418 days ago
    Honestly, I just gave up after I bought my last computer: I installed POP OS, installed Steam, and have so far been able to play all my games without a single issue, except The Witcher 3, which only required a configuration change and I was golden. I will use Windows only on work machines, but on personal machines it's Linux for me from now on.
    • ehcjrvakzjtbe 418 days ago
      you're cursed like the rest of us and you will use windows only on work machines AND GAME MACHINES, and linux for personal

      EDIT: spelling and attempted to adjust elegance

      • suprjami 418 days ago
        Spot the guy who hasn't used Proton yet. It runs Windows games better than Windows does.
      • neogodless 418 days ago
        I'm a recent (mostly) convert.

        So far Anno 1800 has an issue where multiplayer games only connect if I play on Windows (but single player runs flawlessly in Linux Mint.) Every other game I've played has been great. StarCraft II (in Bottles), Conan, Valheim, Northgard (so far.)

        PC gaming on Linux is not perfect, but it's really damn good.

        (This is on a Ryzen 7 + Geforce RTX laptop)

        • giancarlostoro 416 days ago
          I was a little afraid when I opened up Legion TD 2 if the multiplayer would work or not, but sure enough it did, no issues! I was genuinely surprised to be proven wrong on my worries.
      • giancarlostoro 418 days ago
        Not really, Proton on Steam runs all my games just fine on Linux.
        • BirAdam 418 days ago
          I use an Intel Arc A380 and on Linux it was using DXVK from day one, and therefore the performance issues Winders folks had weren’t a problem for me at all. It did get seriously better with kernel 6.2 recently though.
  • hedora 418 days ago
    I had a similar experience with GrapheneOS on my pixel 6 pro.

    It got multi-day battery life out of the box, which is far in excess of what Google advertises for that hardware.

    Once I installed google play services (which have zero end-user benefit, other than enabling compatibility with apps that have bundled google spyware), battery life more than halved, bringing it in line with what Google claims.

    I suspect anti-trust and consumer protection lawsuits would start flying around if more people realized that over 50% of their phone battery was there to support malicious bundled code.

    • vbernat 417 days ago
      Cloud messaging comes with Google Play Services. That alone could explain the battery difference.
    • cat_plus_plus 418 days ago
      Maybe you are OK with paying for every app or making do with open source ones that do not benefit from an ad revenue stream. Plus, you don't need maps, pay or cast support. But many other people like these features, and if they don't, isn't it great that there are working AOSP builds for the Pixel 6 Pro so that they can roll their own ROMs on top of that? No need to hack like for Windows 11.
      • hedora 418 days ago
        Why would maps (get lat/long from GPS chip when navigating / searching), cast and pay need to burn battery when the phone is idle?

        (Also, third party map implementations, such as Organic Maps and HERE WeGo, can be installed and run fine without impacting battery life when they are not running.)

        The answer is that the actually-useful features are bundled with mandatory malware that does need to run in the background in order to implement 24/7 surveillance. That bundling clearly violates US antitrust law.

        Also, I suspect most people buying >$1000 phones would be willing to pay tens of dollars for lifetime licenses for maps, pay and cast (which is roughly what they would cost as standalone products), especially if they were privacy preserving and doubled the phone's battery life.

      • petodo 418 days ago
        maps work perfectly fine on my no-gapps phone, and so do WhatsApp and Signal
  • nerdjon 418 days ago
    I really think Microsoft needs to take a hard look at Windows and realize that it needs the ability to switch, install, or even decide at boot as a purpose built OS.

    Take gaming for example, I pretty much only use my PC for gaming (I prefer my Mac for general purpose stuff) and there is a lot there that is really unnecessary. But where this really becomes an issue is on devices like the Steam Deck.

    I installed Windows 10 on mine, used a debloat script to remove anything that was not strictly necessary for gaming, downloading games, and related tasks and I was able to get better performance and battery life for the same games than I did under SteamOS.

    While I imagine that this would complicate testing of updates to support these separate purposes, it feels like Windows is trying to do too much all at once.

    However I also recognize that much of what I removed is also things like telemetry that I doubt they would remove.

    • eliasdaler 417 days ago
      "and realize that it needs the ability to switch, install, or even decide at boot as a purpose built OS."

      They don't care. This won't bring them money. Showing you ads and tracking you will, so they'll continue doing it.

    • hnarn 418 days ago
      What debloat script did you use, and how did you decide which one to use? My experience is that there are a lot of them out there, and it's impossible to tell which ones actually do something that results in an observable difference, and what the potential drawbacks are of the things being changed.

      A lot of the time I feel like you end up with having to do a lot of research for a very minor practical effect.

      • nerdjon 418 days ago
        I used this one https://github.com/Sycnex/Windows10Debloater and yeah, I had to heavily customize it, and then I did need to re-enable something afterwards, which I found on the GitHub page.

        Basically, what I did was start with the defaults and then uncheck (or check? I don't remember what the UI called for now) anything related to Xbox and the Store, and I didn't have any issues.

        I also did a comparison before and after, and it was actually a pretty decent improvement: about a 10 fps gain over SteamOS and normal Windows 10.

        For me the biggest incentive was being able to play xbox game pass games and not needing to worry about any compatibility issues with Proton which is why I went down that route.

        But yeah, your second part is very true. I feel the impact is minimal if you are on a traditional PC. But on something with such limited resources as a Steam Deck, the difference can be going from 40 fps to the mid 50s and a few more minutes of battery life.

        But it isn't something I would recommend most people do. More just kinda pointing out that, with some effort, I think Microsoft could make a lean Windows really just by taking a look at what actually needs to run for specific tasks.

        • j-bos 418 days ago
          I used that one as well on the machine I use exclusively to log into work through a web browser. The change was incredible; battery life shot up from 2.5-3 hours to 6 plus.
    • forgotpwd16 417 days ago
      >I really think Microsoft needs to take a hard look at Windows and realize that it needs the ability to switch, install, or even decide at boot as a purpose built OS.

      Possible by making your own custom image using dism. Everything debloat scripts are doing can be done before even making an ISO.
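
      As a rough sketch (paths are illustrative assumptions), the usual flow is to mount the install image offline, strip provisioned packages, and commit the changes before building the ISO:

      ```bat
      :: Mount the install image extracted from the setup media:
      dism /Mount-Image /ImageFile:C:\iso\sources\install.wim /Index:1 /MountDir:C:\mount

      :: List the provisioned Store apps, then remove unwanted ones by name:
      dism /Image:C:\mount /Get-ProvisionedAppxPackages
      dism /Image:C:\mount /Remove-ProvisionedAppxPackage /PackageName:<name from the list>

      :: Commit the changes and unmount; the modified WIM then goes back into the ISO:
      dism /Unmount-Image /MountDir:C:\mount /Commit
      ```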

    • npteljes 417 days ago
      >I really think Microsoft needs to take a hard look at Windows and realize

      Why though? Microsoft is a business, Windows is a product. It works well as a product, sales is good, deployments are many. Why should they reconsider the current strategy?

  • college_physics 418 days ago
    There really should be a penalty for gratuitous waste of resources. Energy consumption is neither free nor without adverse impacts. The era of planned obsolescence, of bloatware expanding to gobble up and justify ever more hardware, is over.

    A fresh take on the desktop, given the monstrous devices now available in terms of CPU, memory etc., may completely redefine the footprint of personal computing.

    • 72deluxe 418 days ago
      Try telling that to the users of Slack and other modern "desktop" applications that are simply web browsers with poorly-implemented UI elements attempting to mimic native UI elements (like menus).

      I wish your idea took off, but the modern "developer" (even with insane amounts of funding) seems capable of only writing memory-hogging garbage.

      • shortcake27 418 days ago
        > the modern “developer” seems capable of only writing memory-hogging garbage.

        They’re perfectly capable of writing good software but actively choose not to. 1Password is the perfect example - lots of money, good engineers, a team who proved actually did write beautifully implemented native applications. Then they switched to Electron so they could avoid double-handling, and now users are faced with a laggy, buggy, janky, resource-intensive application.

    • nine_k 418 days ago
      > a penalty for gratuitous waste of resources

      It used to be the price of the hardware. Partly it still is: people won't use your software if they can't afford machines that are able to run it. But hardware gets cheaper.

      It's now more about power consumption: both as price of electricity and as battery life.

      But usually the "waste of resources" is "less time spent developing"; that is, the users get to use some capability sooner.

    • phendrenad2 418 days ago
      No, there should not. Otherwise every single app shop on earth is going to be paying it.
    • bob1029 418 days ago
      No there should not. Laggy bullshit software is already bad enough and developing low latency user experiences is only getting harder.

      There are applications that are extremely vulnerable to energy saving crap. Anything real-time is simply going to need to consume more power. Waiting 15.625 milliseconds (the default 64 Hz system timer tick) completely breaks some applications.

      You will take timeBeginPeriod(1) and friends from my dead, cold hands.

  • jug 418 days ago
    A regular Windows 11 will also run on 2 GB of RAM. The official requirement is 4 GB because it'll stutter a lot otherwise, but there is no software block against it. This article claims the de-bloated Windows 11 will run "great" on 2 GB, so I guess there's that... But I have to wonder according to which definition of "great", given the lack of benchmarks against the official requirements...
  • cat_plus_plus 418 days ago
    Microsoft should just offer an option for a minimum base install with on-demand feature download from the cloud. It can't be the only option, because some devices need to be fully featured online, but otherwise it wouldn't cost them anything, not even the ability to nag people to try more of their freemium stuff. The advantage for Microsoft would be access to the low end Chromebook market while maintaining decent user satisfaction. Also, fewer components on each device means faster and less annoying updates, and fewer embarrassing high profile hacks through exploits in components that the customer didn't need. They could also come up with a way to periodically purge components unused for an extended time.
    • neogodless 418 days ago
      Like Windows Server, with Roles and Applications?

      Except your options in Windows 11 are "Buzzfeed style widget, remove all sensible taskbar configuration, add extra steps to any context menus, insert ads for OneDrive, Office, and Candy Crush at every opportunity!" Who would add them? :)

      But seriously, the way Windows Server handles it is just great. Windows 11 could potentially have a more minimal install.

  • y-c-o-m-b 418 days ago
    Something like this would've been awesome a decade ago for Windows 8 when I was doing testing on netbooks with poor hardware.

    I'm on Windows 10. After running O&O ShutUp10, this OS has been as good as Windows 7. I can do all my software dev, gaming, video editing/graphic design, etc. on this operating system without issue. I don't think it's ever crashed on me. It genuinely makes me curious what kind of issues people are running into. After all these years being a power user of Windows, so far it's been a smooth ride. The last time I had trouble was with Windows XP and Windows ME before that. I skipped Vista and Windows 8 (except for working on netbooks at Intel back in the day).

    I tried Windows 11 in October 2021 and it was an awful mess. Tried too hard to appeal to MacOS and Linux users when that ship has long sailed. Not sure what state it's in now, but I've got no plans to upgrade until I either buy a new device or until Windows 10 gets some serious security vulnerability that's not on Windows 11.

  • hsbauauvhabzb 418 days ago
    My biggest issue is disk bloat. I use many VMs on my laptop, and Windows VMs regularly end up at 60+ GB for a pretty standard deployment (Windows + MSVS or Office). It normally comes down to a bloated WinSxS folder, but I feel like attempting to fix that is playing with fire.

    Compare that to Linux - 10 GB with a full GUI and minimal bloat after the fact - and it's extremely frustrating.

    • firecall 418 days ago
      Reminds me of the pain point I have with Windows Gaming PCs!

      Both my teenagers have 2x 1TB NVME drives installed to deal with the insane requirements of Steam, Epic and Xbox gamepass games.

      We live in Australia with ~25 Mbps FTTN download speeds, so installing and uninstalling is a huge pain and isn't practical.

      I have a similar issue with Apple selling Macs with 256GB hard drives! Even with iCloud photo and docs offload, these Macs are close to useless as you'll constantly bump into storage issues.

      • poopooracoocoo 416 days ago
        Doesn't help that Epic Games has three instances of Chromium. Add that to Steam's and all of the games' instances and you've easily gotten at least 2 GB of duplicate instances of Chromium. Edge and Edge WebView, at least, are hard-linked, provided that they're the same version.
      • hsbauauvhabzb 418 days ago
        Apple's cloud and OneDrive create conflicts of interest that result in kneecapping local storage.

        Steam is another beast; you could consider a Steam cache server or similar, or alternatively teach your kids how to transfer unused games from primary storage to secondary, and drop a 6-10 TB drive in each machine.

  • morpheos137 418 days ago
    Software desperately needs a concept of limited scope and a finished version plus maintenance. Continually adding features and complexity may make work for software developers, but I think at a certain point we enter negative utility territory.
  • lambdaxymox 418 days ago
    Are the enterprise versions of Windows 11 also filled with bloat?

    I got some keys second-hand for Windows 10 Enterprise LTSC a few years ago, installed it on some ten year old hardware at the time, and I was honestly surprised how responsive Windows could be absent (to the best of my knowledge) the telemetry software, Cortana, etc., and how fast it could boot. It's almost like the true blue good Windows experience without all the nonsense is secretly reserved for only business customers and pirates.

  • unraveller 417 days ago
    This version is bloated compared to the superlite 'divinity' build (1.5GB iso, capable of running smoothly on 1.5GB RAM, not serviceable) and the x-lite 'resurgence' build (2.5GB iso, fully upgradeable back to standard). Some quirks here and there, but fewer clicks and less hassle than anything official.

    https://youtu.be/Nh7po_P8qNU

  • pancrufty 418 days ago
    It’s a great exercise but this is akin to Alpine Linux: it works for some things but you wouldn’t use it as your daily driver
    • tlamponi 418 days ago
      Well, Alpine Linux actually works, with backward compat (glibc can be easily installed), and uses a few dozen MiB of RAM.

      FWIW, I use Alpine Linux on my PinePhone in the form of postmarketOS (an Alpine derivative) with a full-fledged KDE desktop, running Firefox alongside. IOW, you can use it as a daily driver just fine; you just need to install the respective packages - which naturally makes it use more resources, but even then far from what Windows will use.

    • kytazo 418 days ago
      Well, to be honest, I've been daily driving Alpine Linux for quite a while, first on an x86 desktop and nowadays on my Apple Silicon MacBook.

      In my experience, Alpine is a good fit for anything from an OCI container all the way up to a full-fledged desktop or your server. But that's not all: you can have it running on your RPi or even your smartphone, as architectures like ARM are really first-class citizens - something relatively uncommon even with popular distros like Arch, which has only a fraction of its packages available for other architectures.

      Alpine may come pretty bare-bones by default, but don't let that fool you; it's more than capable of anything a regular distro is, if you know what to do with it. Even if you're a casual Linux user, you can get it set up in no time using the setup-* commands that it ships with, e.g. setup-desktop, which takes care of setting up a desktop environment without you having to worry about dbus, seatd, compositors or things like that. Also, their repositories are filled with almost any package someone would need, and can always be coupled with complementary package managers like nix and flatpak in cases where apk isn't enough.

      I love Alpine, and the aforementioned reasons are only a fraction of why - especially when considering things like running on a much leaner, modern C runtime (musl instead of glibc), being systemd-free, and having a minimal-dependency, bare-bones/bloat-free philosophy, as it was originally intended for use on constrained embedded devices like routers. It's one of the best distros available in my opinion, if not the best, alongside NixOS and Gentoo, which I deeply respect as well. That being said, one has to factor in the drawbacks that some of these features, like being systemd-free and using musl, imply when assessing compatibility - but I'm having trouble remembering cases where I've run into deadlocks, even on exotic setups like Alpine on aarch64 running natively on an M1 MacBook with a custom kernel like asahi-linux, or an sdm845 OnePlus 6T smartphone with pmOS.

    • anthk 418 days ago
      Alpine worked great on my netbook with 2GB of RAM. Advanced browser support, Mesa 22 with GL 2.1 for the old iGPU, LibreOffice... everything basic ran faster than on any typical distro.

      Alpine with XFCE + dhcpcd-ui as a "WiFi seeking menu" would run circles around Windows 11 using 1/10 of the RAM. With Bluetooth support via Blueman and everything.

    • npteljes 417 days ago
      Not really the same. This Windows is more like an experiment, like Bellard's Linux in the browser. If you'd like to use a less annoying Windows as a daily driver, LTSC is the way to go.

      https://bellard.org/jslinux/

  • ComputerGuru 418 days ago
    The post mentions that the Windows App Store is still available and can be used to get required apps as needed, but I don't think this is really correct. Component activation plays a huge role in "modern" UAP or WinRT-enabled applications and without WinSxS, I'm not sure how much of component activation will work.

    Obviously basic component activation is functional otherwise the shell wouldn't function (my biggest problem with WinRT/UAP: its insidious creep into the OS "internals" rather than just powering apps, widgets, add-ons, whatever on top of the base system), but I'm not sure how many apps you might pick at random from the app store will still work.

  • ChildOfChaos 418 days ago
    I feel like a new law needs to be named after this (maybe it already exists and I am just unaware of it): "basic computing functions and operating system requirements expand to match the standard level of computing performance available".

    It's crazy how things that were considered basic many years ago should run well within the performance we have in modern systems, yet the basic system has requirements much higher than it used to.

    Teams is a massive example of this: it's just text chat and video conferencing, stuff that was easily done 10 or more years ago, yet there are plenty of systems available today that run it like crap, let alone imagining running it on a ten-year-old system.

    • npteljes 417 days ago
      There's Andy and Bill's law: "what Andy giveth, Bill taketh away". Reflecting on the fact that software will eat whatever resources are available.

      In a way, I think this reflects how life works in general. The way I see it, life expands until there's significant hindrance, or resources are exhausted. I don't mean it in a cynical way, like how Agent Smith does in the Matrix, regarding humanity, I just think that this is the nature of life in general.

      https://en.wikipedia.org/wiki/Andy_and_Bill%27s_law

  • janosdebugs 418 days ago
    Rant: Windows, sadly, seems to be moving more and more in the direction of the user being the product, not the customer: you get spied upon, you get ads, and you are subjected to changes to the deal without any ability to object. The only exception seems to be the enterprise versions. Darn it, we are paying for the bloody thing!
    • mixmastamyk 418 days ago
      "I am altering the deal. Pray I do not alter it any further."
  • grumpywndw_user 418 days ago
    Windows 10 pushed me to become a permanent macOS and linux resident.

    I don't know what went wrong but Windows 2000 was perfect, with 7 being almost as good.

    • npteljes 417 days ago
      What's wrong is not to realize that these are businesses, not endeavors to create the perfect operating system. Your incentives are just not aligned with Microsoft's, simple as that.
      • 10xDev 417 days ago
        Good thing you can be both a business and still care about user experience.
        • npteljes 416 days ago
          Yeah, I hope to see much of it!

          This got me thinking: I wonder what's the largest commercial entity that I'd consider good in this regard.

  • LinuxBender 418 days ago
    Previous submission [1] Is there a repo with the scripts and tools required to build these images from a bog standard Windows 11 ISO?

    [1] - https://news.ycombinator.com/item?id=34647699

    • RamRodification 418 days ago
      Curious as well. Installing a Windows version that some guy has messed with is out of the question for me. Maybe he has lots of cred on the scene and I'm overly cautious, I dunno.
  • imglorp 418 days ago
    It doesn't take much tinfoil to imagine hardware vendors appreciating bloated OSes: they drive users up-market by necessity. If everything stayed trim and fast, users would have no reason to upgrade every year. In exchange for this favor, plus a little more grease, the OEMs were more willing to collect the MS Tax on every PC and lock out all others.
  • westcort 418 days ago
    I would LOVE to see a class action lawsuit against Microsoft for intentionally making computers obsolete over time through bloated updates. It has personally cost me thousands and it’s about damn time.
  • MarkusWandel 417 days ago
    I think various Linux distributions would comfortably run in 2GB of RAM even without debloating. Until you start a web browser and open a few tabs. Then it doesn't matter how lean your OS is.
  • squarefoot 418 days ago
    I think the two top questions lots of Windows users would like answered now are:

    0- Did they also remove telemetry and similar malware?

    1- Is it usable for gaming? I mean, didn't they also remove anything important among the cruft? I have memories of shrunk XP "distros" back in the day that were hacked to the point they refused to run a lot of software.

    • nabakin 418 days ago
      I have another one to add: does Windows Update work? I'd like to keep receiving security patches
      • squarefoot 418 days ago
        It seems updating is possible only to some extent and done by hand.

        FTA: "This OS install “is not serviceable,” notes NTDev. “.NET, drivers and security definition updates can still be installed from Windows Update,” so this isn’t an install which you can set and forget. Moreover, removing the Windows Component Store (WinSxS), which is responsible for a fair degree of Tiny11’s compactness, means that installing new features or languages isn’t possible. If you install and enjoy Tiny11, we guess you will have to look out for ISO updates as major feature revisions of Windows 11 arrive."

        • nabakin 418 days ago
          Thank you for the help. That's too bad
    • phendrenad2 418 days ago
      Telemetry isn't malware.
      • BirAdam 418 days ago
        Depends upon your perspective.
      • gigel82 418 days ago
        Telemetry that you can't disable is spyware. And spyware is malware. QED.
        • phendrenad2 415 days ago
          But you can disable it, by not using the software that includes telemetry. QED.
  • pkphilip 418 days ago
    I think there is a great market for a "debloated" windows from MS if sold directly from them
  • jksmith 418 days ago
    Why do I feel like the use case here is "Because unfortunately I can't use linux"?
  • wnevets 418 days ago
    > Moreover, removing the Windows Component Store (WinSxS), which is responsible for a fair degree of Tiny11’s compactness, means that installing new features or languages isn’t possible.

    I don't know if I can consider this "bloat" removal.

    • hedora 418 days ago
      Couldn’t you add tar back in, then package whatever these updates are as tarball patches of C:\, and a regedit script?

      That’d take well under 1MB.

  • harha_ 417 days ago
    I will never upgrade to windows 11. I'm almost completely free of the burden of Windows 10 anyways, 90% of the things I do I can do on Gentoo linux. It's the 10% that forces me to still dualboot to windows...
  • newsclues 417 days ago
    Microsoft could make good money selling tuned versions of Windows for gamers.

    I won't buy a license for a windows gaming box, because it's trash. I would pay for a non-trash version of Windows.

  • bsuvc 418 days ago
    How does this compare to a lightweight Linux distro like Lubuntu?
    • camel-cdr 418 days ago
      My daily-driver Debian system idles at 400MB of RAM usage. While writing this, it's using 1.2GB of RAM, with two browser windows and about 40 tabs open.

      My vastly less powerful Manjaro ARM laptop with the same setup idles at 160MB.

      Most of the RAM seems to be used by systemd btw.

    • stainablesteel 418 days ago
      i used to run arch on a laptop with only 2 gb of ram and had zero problems for years, it was useful because it got great battery life

      memory was basically a non-issue unless i was trying to compile a large package, i don't recall the precise baseline it had after boot but it was probably around 12-25%

    • asicsp 418 days ago
      Just checked `free` on my Ubuntu desktop: 913MB for a browser with 3 tabs.
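
      (For anyone wanting to reproduce this kind of measurement: `free` gets its numbers from `/proc/meminfo`, so a rough equivalent can be scripted directly. A minimal sketch, assuming a Linux system; the "used" figure here is MemTotal minus MemAvailable, which is close to but not identical to `free`'s own "used" column.)

```python
# Rough equivalent of checking RAM usage with `free`, read from
# /proc/meminfo (Linux-only; field names as documented in proc(5)).
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.split()[0])  # values are in kB
    return info

m = meminfo()
used_mb = (m["MemTotal"] - m["MemAvailable"]) / 1024
print(f"used: {used_mb:.0f} MB of {m['MemTotal'] / 1024:.0f} MB total")
```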
    • zozbot234 418 days ago
      Lubuntu is not really "lightweight" in my experience. A really light Linux install uses maybe 0.2 GB RAM at the desktop, which lets you do some very basic web browsing even on a 1 GB system.
      • ComputerGuru 418 days ago
        As someone who has built Linux (and various BSDs) "from scratch," I think your statement contains the biggest contradiction/problem.

        I can get a fully functional desktop with GUI apps, generic hardware support (i.e. not locked down to my hardware), support for dynamic modules/drivers/libraries, audio, 3D-accelerated video, and more in a 300 MiB footprint (with only basic ISO image compression), runnable with 128 MiB of RAM.

        Then comes in the last part of your statement: "do some very basic web browsing." The system above works just fine with a browser featuring < 201x tech, with great CSS, JS, HTML support. But if I need to build and bundle the latest Firefox, Chrome, or whatever without manually stripping out a ton of features (beyond what is available via distro package managers), that footprint triples or quadruples in size and the memory requirements skyrocket.

  • temporallobe 418 days ago
    Sure, and you could strip an average car down to its shell, steering wheel, throttle/brake controls, a seat, and probably get away with a 75 hp engine. Neat!
  • imperialdrive 417 days ago
    Booted with less than 512MB ram just fine here. Wild. Technically it booted with 64MB but quickly hit errors. 32MB doesn't fly at all though.
  • mixmastamyk 418 days ago
    Yeah, I remember using NT4 on a machine with 20MB of memory on a lab machine and thinking it was an ungodly amount. A few years later used an SGI with 256MB and thought the same. Actually needed it to flipbook a few minutes of movie resolution frames in RAM. cough
  • debacle 418 days ago
    Windows 11 has some nice features but I am on 10 until they fix some things.
    • tester756 418 days ago
      Which? I've only noticed lack of right click on taskbar, which sucks, but I don't see other things that need fix
      • RamRodification 418 days ago
        Right-click menu on files missing all the actually useful stuff is a huge annoyance for me. I think there's some registry change that somewhat brings it all back, but IIRC it doesn't always give you the full menu anyway.
        • AlfeG 418 days ago
          There is an Explorer Patcher tool on Github. It's doing pretty much great work for fixing Win11 issues!
        • tester756 418 days ago
          oh, I've changed that right away after W11 update with single reg. change and since that it's fine
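
          (For reference, the registry change usually cited for restoring the classic context menu is, as far as I know, an empty InprocServer32 default value under the CLSID below; this is the widely circulated tweak, not something documented by Microsoft, so apply at your own risk and restart Explorer afterward:)

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32]
@=""
```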
      • jaredhallen 418 days ago
        The missing option to never combine taskbar buttons is really bugging me.
        • krembo 418 days ago
          Being a pain for me as well, I ended up using apps to fix that like Explorer Patcher, StartAllBack or a registry hack I've just read about.
      • chmod775 418 days ago
        Can I put the taskbar along the left side of the screen yet?
        • AussieWog93 418 days ago
          In all seriousness, this is the main reason I haven't checked out 11 yet.

          Changing the position of the taskbar like this is "OS Smell", and I can't ever see 11 becoming more than the next ME/Vista/8.

        • NayamAmarshe 418 days ago
          No, Microsoft knows better than the user.

          "Nobody uses that feature, what are you, a Linux user!?" /s

        • phendrenad2 418 days ago
          No, but I don't think they're ever going to change that back.
  • k__ 417 days ago
    This is still a thing?

    I remember using build tools to strip down Windows install CDs/DVDs back in the day to get the most performant and minimal installation possible.

  • pedro2 418 days ago
    I haven't thinned an OS since Windows XP (Black Viper, you rule!). Why put yourself in harm's way by using an unsupported configuration? I briefly considered O&O ShutUp for shutting up Windows' telemetry. From what I read, it works reasonably well, but made Edge not start after some updates.

    I use Windows and Linux, for privacy concerns. If you want privacy, go for Mac (not that I'd do it). On mobile, still working out what are the best options.

  • vmoore 418 days ago
    I can only assume this 'release' has no malware or backdoors in it. I love the work of NTDev and even follow them on Twitter, but since this is closed source there's no way of eyeballing what changes have been made. I would run this in an offline sandbox VM for trying out different Windows software, but I wouldn't connect it to the Internet.
    • hedora 418 days ago
      The same could be said for official Windows builds, except we know for sure they contain malware and back doors.

      For instance, Microsoft engineers have the ability to pull arbitrary files off Windows 11 machines, at least according to Microsoft press releases from a few years ago. Doing so required “managerial approval”, and was “only for debugging software faults”, but anyone vaguely familiar with the US CLOUD act knows that they’re legally required to provide the same access to law enforcement searches.

      • smileybarry 417 days ago
        > For instance, Microsoft engineers have the ability to pull arbitrary files off Windows 11 machines, at least according to Microsoft press releases from a few years ago.

        I seriously doubt it. Do you have a source for this?

  • nix23 417 days ago
    I just use O&O AppBuster and ShutUp; it's safe and the RAM usage goes down massively.
  • NayamAmarshe 418 days ago
    Windows at this point only exists to provide worse user experience on purpose and collect as much personal data as Microsoft possibly can.

    Not that I'm complaining though, I don't need Windows anymore and ZorinOS serves me just fine while not making me feel like I'm in a tech prison.

    • qup 418 days ago
      Have you used other flavors? Tell me about Zorin? I'm on Ubuntu out of mostly momentum, but it's slow as hell. I've used Debian, Arch, and Mandrake in the past. I'm a web-dev so I need all my dev packaging etc to work predictably, a good terminal emulator, and ideally firefox.

      I love slim systems so I'd really like to trim some fat.

    • philistine 418 days ago
      Wow, the tech villain from the worst James Bond movie still sells an OS!
  • indigodaddy 418 days ago
    I’m guessing there is no ARM/Pi version for this?..
    • userbinator 418 days ago
      From what I've seen, the ARM version of Windows is quite similar so you may be able to do the same things to it.
  • nirav72 418 days ago
    This might come in handy as a VM.