This is all great, but while the internet is filled with auto-playing videos and calls to external scripts from Google, Meta, Amazon and who knows where, the example on this page is about saving... 148 bytes?
Does minifying CSS really make a big difference in the end? I'm guessing that a page with a serious CSS payload (say, over 100k) already has more serious weight problems elsewhere.
The main problem with CSS is that it is "render blocking". If you want the user to see stuff sooner you have to make the CSS smaller and/or serve it faster (e.g. inline it to avoid another request).
I haven't measured yet but my guess would be that on current devices 99% of the CSS size saving matters for making the transfer of the CSS faster over the network (and that the speed up in constructing the CSS Object Model is negligible).
=> Users with low bandwidth and/or a flaky connection benefit
CSS is a high level language used very often as a low level one. The styling doesn't use any "cascading" and everything is repeated all the time. In this context minifying can make a difference. But either way it is not among my favorites.
Does minifying make any difference if you're gzip-ing it in-flight anyway?
Edit: I mean, in the real world. Obviously in their example gzip does pretty much nothing since there's so little content, which leaves very little room for compression.
CSS is text based, so it should be gzip (or brotli) compressed. Minifying something isn't as effective as compressing it. You could do both, but the difference is small.
I started linking component stylesheets directly in the component HTML, de facto eliminating unused CSS from the output. HTTP3 provides performance gains we can benefit from by serving multiple CSS files "just-in-time".
I've had great results since. No more unused CSS. No more minifying nor bundling. What you write is what you get. It's liberating! Hehe.
I've used it on a few projects successfully. The property-ordering issue mentioned in other comments is good to be aware of, though; it's not difficult to switch to cssnano if it bites you. The important feature for me was being able to target a browserslist string and flatten CSS nesting.
Is there a tool that can remove unused CSS while considering static HTML?
What I have in mind is a tool that I throw a CSS file and a bunch of static HTML files at, it will try to apply each CSS rule to the HTML like a browser would and remove all the rules that don't apply.
I don't expect it to ascertain if a rule had visible effects. I also don't expect it to consider JavaScript. Just plain CSS and static HTML. It doesn't look to me like cssnano or LightningCSS could do that.
https://purgecss.com/ does this, kind of -- it used to be recommended by Tailwind back when Tailwind shipped a giant zip-bomb stylesheet of every possible property/value combination by default. I don't think it does the more complicated browser-like analysis you mention, though; it might just check whether class names appear in your HTML using a regex search.
The AMP WordPress plugin also does something like this IIRC (to try and fit stylesheets into AMP's size limit) but the tooling for it might be written in PHP.
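For what it's worth, the naive class-name-in-HTML check described above fits in a few lines. This is just the idea, not PurgeCSS's actual implementation, and it deliberately ignores nesting, `@media`, attribute selectors, etc.:

```javascript
// Naive CSS "purge" sketch: keep a rule only if every class name in its
// selector appears somewhere in the HTML as a bare token.
function purgeCss(css, html) {
  // Token set from the HTML: anything that looks like a word/class name.
  const used = new Set(html.match(/[\w-]+/g) || []);
  // Very simplified rule splitting: top-level "}" only.
  return css
    .split("}")
    .filter(rule => rule.trim())
    .filter(rule => {
      const selector = rule.split("{")[0];
      const classes = selector.match(/\.([\w-]+)/g) || [];
      // Rules with no class selectors (e.g. bare element selectors) are kept.
      return classes.every(c => used.has(c.slice(1)));
    })
    .map(rule => rule.trim() + "}")
    .join("\n");
}

const css = ".btn{color:red}.unused{color:blue}p{margin:0}";
const html = '<p class="btn">Hi</p>';
console.log(purgeCss(css, html));
```

The obvious weakness is exactly the one discussed further down: any class name produced by string concatenation at runtime never appears as a token in the source and gets purged.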
I wrote that for my company ~3 jobs ago, except instead of working only on static HTML, it would: for a small percentage of our traffic loop in the background processing a couple CSS selectors at a time, adding CSS selectors that matched to a bloom filter that would be occasionally POSTed back to the server. Then I'd parse through our logs, pulling out the bloom filters, and comparing them to our CSS rules to find unused rules. It wasn't perfect so it required manually checking before deleting CSS rules but it went a long way towards flagging potentially unused CSS rules.
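A Bloom filter for this is tiny. Here's a minimal sketch of the client-side half of that scheme (sizes, names, and the hash are illustrative, not the original implementation):

```javascript
// Minimal Bloom filter: a bit array plus k seeded hash functions.
class BloomFilter {
  constructor(bits = 1024, hashes = 3) {
    this.bits = bits;
    this.hashes = hashes;
    this.data = new Uint8Array(bits);
  }
  // FNV-1a-style string hash, seeded so each "hash function" differs.
  hash(str, seed) {
    let h = 2166136261 ^ seed;
    for (let i = 0; i < str.length; i++) {
      h ^= str.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.bits;
  }
  add(str) {
    for (let s = 0; s < this.hashes; s++) this.data[this.hash(str, s)] = 1;
  }
  mightContain(str) {
    for (let s = 0; s < this.hashes; s++)
      if (!this.data[this.hash(str, s)]) return false;
    return true;
  }
}

// In the browser you'd loop over a few selectors at a time, roughly:
//   if (document.querySelector(sel)) seen.add(sel);
// and occasionally POST the bit array back to the server.
const seen = new BloomFilter();
seen.add(".btn");
console.log(seen.mightContain(".btn"));
```

False positives are exactly why the result "wasn't perfect" and needed a manual check before deleting anything: the filter can claim a selector matched when it never did, but it never misses one that actually matched.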
This would work nicely with static HTML, indeed. But once you have some JavaScript, i.e. dynamic HTML, it won't work reliably anymore. Worse, you might end up having to maintain a manually curated allow list of selectors on top of it.
Waaay back there used to be a Firebug plugin that would monitor which CSS was ever used as you interacted with a page. Worked great for exactly these dynamic pages - I was using it with React for a few years before Quantum killed the XUL addons.
I'd forgotten about it and am now wondering if anyone made a replacement...
At least regarding tailwindcss: it scans your code to filter out unused CSS classes, so it is recommended to have full class names in the code instead of constructing them by concatenating strings. If you have a variable `buttonColor` that can be `red` or `blue`, it is better to write out both full class names.
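A minimal sketch of the two patterns (the class names are invented for illustration, not from any real config):

```javascript
// Bad: the scanner never sees "text-red-500" or "text-blue-500" as literal
// strings in the source, so both may be purged from the final stylesheet.
function badButtonClass(buttonColor) {
  return "text-" + buttonColor + "-500";
}

// Good: every full class name appears verbatim in the source, so the
// scanner keeps exactly the ones you can actually use.
function goodButtonClass(buttonColor) {
  return buttonColor === "red" ? "text-red-500" : "text-blue-500";
}

console.log(goodButtonClass("red")); // "text-red-500"
```

Both functions return the same strings at runtime; the difference is purely whether a static scan of the source can see them.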
If you are in a big project and want to refactor classes, the string concatenation pattern gets extremely painful; it's even worse when it's done both on the server and on the frontend. At some point you will have trouble removing your classes at all. My company did this a lot and now we struggle to clean it up, since basically all classes appear to be used somewhere. What we do is run purgecss, diff the result, and check every occurrence that got removed, which will take us a few months.
Eh, even in this example, most of the gains come from simply removing the comments, something that can easily be done on most UNIXes with `cpp -E -P <in.css >out_cpp.css`, or with regexps, since CSS doesn't allow comment nesting.
Here's a more complete result adding brotli and the aforementioned cpp version:
Can't say exactly why, but I had this suggested by IDEA (?) as well. Could be because `background` is a kitchen sink of everything there is to a background, including several different DSLs for gradients and such. So it makes sense to keep just `background-position` if that's all you're changing: the browser will spend less time processing it.
Whether that is measurable is a different story :)
Happy to learn otherwise though!
So maybe another thirty percent, probably from removing comments etc, not just whitespace.
2. CSS often blocks page render, to avoid flashing unstyled content.
3. "Tree-shaking" CSS is often difficult or impossible.
Many (most?) CSS libraries minify their outputs: Bootstrap, Materialize, Semantic UI, Foundation.
https://lightningcss.dev/
https://github.com/parcel-bundler/lightningcss/issues/547
https://github.com/parcel-bundler/lightningcss/issues/572
https://purifycss.online/
Above is a nice online version of Purify. But it seems to just minify the CSS and doesn't actually delete the unused rules.
The only thing that does not work is generating the CSS class dynamically, like:

    var x = 'hase-';
    var t = x + 'danger';

which is an antipattern. (It can even happen inside Java, C#, whatever language you use, and it is still an antipattern.)
I do find that CSSO[1] with structural optimizations can be more effective still; perhaps that will become less true over time, though.
I think once Lightning CSS is more stable (in particular, once some bugs around rule ordering are addressed) it will be the clear winner in this space, though.
[0]: https://github.com/cssnano/cssnano/issues/833
[1]: https://github.com/css/csso