This is an incredible achievement... not just for the technical depth, but for what it represents. Alyssa's work is nothing short of inspiring. The way she combined deep technical insight with years of dedication has not only brought open-source graphics to Apple Silicon, but also lit a fire under reverse engineers and open-source developers.
She has shown a whole new generation that curiosity and persistence can break barriers. I thoroughly enjoyed watching the developments these past several years. Massive respect to her and everyone who made this possible, and kudos on her new position at Intel.
What an end to an era. It's crazy to think she started this journey at 18 and finished it five years later. Not many people believed they would be able to make the GPU work in Asahi Linux. Kinda curious what her "Onto the next challenge!" link means. Is she working on Intel's Xe-HPG next?
Wish her the best with this. Intel staying competitive in GPUs can only benefit the consumer. Those who want a mid-tier graphics card, without paying to compete with AI use cases, may not be a huge group, but we do exist! Those who use desktop Linux may be a small group among that small group, but we do exist!
Thank Jesus it's Intel and not Apple. Intel has been extremely good at working upstream and has immense contributions in the Linux kernel, Mesa, and elsewhere. Wasting such talent on Apple would make the world worse for us all.
Apple is too much about being closed and creating barriers; I'm not sure that would have been a good fit. Plus it's a good way to flee a country that's quickly degrading.
Honestly if Apple had embraced Linux, the Apple Silicon CPUs would have been amazing for all sorts of server, scientific, and AI/LLM work. Too bad they are clamping down on the walled garden to focus on consumer toys instead.
I do have an M2 MacBook running Asahi, which works amazingly well for my casual use, but I think there is no way anyone will use hardware that is two generations old, on a volunteer-developed OS, for any actual work, server use cases, and so on.
Intel’s core competence is squandering talent by having finance managers and outside consultants make technology business decisions. Something happened to their culture a few decades ago and they forgot that revenue is a trailing indicator of good decisions and you can’t just decide you want to make a lot of money and trust the product strategy to materialize from that.
From "draw a triangle" to upstream Vulkan on M1. Practically, this makes the Venus/virtio path viable for guests on Apple Silicon (no passthrough in VZ), which is what many people actually need.
Although most likely she’s well compensated, and doesn’t have to waste time on useless efforts at work, this level of discipline and striving towards a goal is just very rare in general.
Possibly also no family, limited social life and no other hobbies.
For myself, when I lived on a different continent from my family and had a limited social life and a job with strictly set hours, it was much easier to have the time needed to make significant progress on a hobby.
However, discipline is an enormous factor too, actually using that extra available time on something “productive” is no easy feat.
Now I have kids and live in the same area as my parents and siblings again, entirely happy, but less free time.
2021 and 2022 were also when many places were only just coming out of COVID lockdowns. I remember how much dead time I had back then. I used it to watch lots of series and YouTube videos. I wish I'd had the discipline and motivation to work like she did during that era, with all that free time.
Just to say a big thank-you to the Asahi team, and especially for the GPU work. It is still on my list to get back to some OpenGL dev work. Especially since I recently made fedora-asahi remix my daily driver, and I have to say it is amazing. It feels like I once again own my computer.
Their work has inspired me to continue bashing away at my Zig PinePhone code, although I'll never have the skills to get its GPU running anything beyond a poke'd framebuffer.
That checklist of supported APIs in Asahi is mind-blowing, especially in such a short timeframe. Again, well done, thank-you, and best of luck at Intel.
They’re significantly different GPU architectures. They added support for hardware features like mesh shading, raytracing and better shader occupancy/dynamic caching.
Beyond that, each M series generation also brings more of the system into the SoC. For example, the entire storage controller is part of the SoC in the M1, but the M2 brought in the trackpad controller as well.
Bringing more functionality into the SoC has many advantages but it does make it more difficult to target because you can’t just make use of existing off the shelf controller knowledge to apply to it.
What a project. Of all the IT work I'm aware of, I have a hard time choosing between this and Fabrice Bellard's output; both are, for me at least, equally impressive.
Inspiring stuff! I didn't even expect basic Linux support on M1 to be so good in such a short time-span, leaving graphics aside. I was very pleased when I tried booting up Asahi on M1 a couple months back and went on to get work done in it and even enjoy some games.
Thanks for all your amazing contributions Alyssa and all the best for the road ahead!
They seem closely aligned with the Free Software Foundation (FSF), so I could very well imagine that being a major ideological reason not to want to work with Apple. Yes, Apple sometimes upstream patches and they do contribute to open source here and there, but they certainly are no FSF poster child. Intel on the other hand are about as open as it gets when it comes to their track record in the graphics space. I personally have nothing but admiration for Rosenzweig's work and I hope they will continue to find environments where they can flourish and do great things in the years to come.
Alyssa's post mentions how lots of the work she's done has at least started as side projects while she's working on something else (Panfrost while at high school, M1 drivers while at Collabora). Obviously I'm not her so I can't say anything specific to her. In general, Apple doesn't allow its developers to work on open source projects on the side while employed at the company. I think this is a stupid idea that costs them a lot of talent, but I doubt Apple cares what I think. I've seen multiple cases where an active open source contributor gets hired by Apple, then their presence in open source communities vanishes. Based on all the open source work she's done so far, I think it would take a lot to make her stop all contributions like that.
You do have to wonder how that kind of interview would go. Hopefully it would be actual engineers that created what she reverse engineered instead of some gatekeeper trying to one up her somehow.
May I ask something: I want an Apple Silicon MacBook Air and I will probably just be running Linux on it. What are the pros and cons of getting an M1 vs an M2, aside from more RAM and such?
Thx
Agreed, it is not that stable/usable.
I tested it on an M1 Pro and was hopeful, but after some years I realized it is not viable for daily use. Many things still don't work and I doubt that they will any time soon.
Last year I was given an M4 Pro at work and it is not supported at all.
Looking at the drama and people stepping down, I don't think MacBooks will be properly supported on Linux in this decade.
On the other hand, I have an M2 Air and it's stable, fast, and I haven't thrown anything at it that it doesn't handle perfectly. But the fingerprint reader doesn't work.
(The M3/M4 are in progress but not supported. That's public on the project's compatibility chart.)
Are you coming from Windows? macOS is a BSD descendant so it's quite Unix-y. I never miss Linux on it, and I used to only use Linux. Just learn how to get around the minor annoyances (e.g. the file explorer sucks, I use Emacs for that) and it's a fine OS. It's really not worthwhile trying to install anything else on the Mac.
Sorry to hijack, but since the topic is related: is the development of Asahi Linux still actively ongoing, or has slowed down a lot? The progress for M1 and M2 was steady and now almost everything is done, but the M3+ work still seems to not have started. And with major contributors leaving the project I'm kind of worried for the future of Asahi (on newer Apple hardware).
The new leadership team set a short term goal of getting their existing work upstreamed, which seems to be going well.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
> With Linux 6.16, we also hit a pretty cool milestone. In our first progress report, we mentioned that we were carrying over 1200 patches downstream. After doing a little housekeeping on our branch and upstreaming what we have so far, that number is now below 1000 for the first time in many years, meaning we have managed to upstream a little over 20% of our entire patch set in just under five months. If we discount the DCP and GPU/Rust patches from both figures, that proportion jumps to just under half!
> While we still have quite a way to go, this progress has already made rebases significantly less hassle and given us some room to breathe.
https://asahilinux.org/2025/08/progress-report-6-16/
So if the discussions are true, it can take years for the developers to finish M1/M2 upstreaming with all the Linux kernel bureaucracy. That is, unless they decide to start working on M3 before finishing the upstreaming.
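As a rough sanity check on the pace implied by those figures (the counts below are assumptions read off the quote, not official numbers):

    # Rough figures inferred from the progress report quote (assumptions, not exact).
    patches_at_first_report = 1250   # "over 1200 patches downstream"
    patches_after_6_16 = 990         # "below 1000 for the first time in many years"
    months_elapsed = 5               # "in just under five months"

    upstreamed = patches_at_first_report - patches_after_6_16   # ~260 patches
    fraction = upstreamed / patches_at_first_report             # ~0.21, "a little over 20%"
    rate_per_month = upstreamed / months_elapsed                # ~52 patches per month

    # At that pace the remaining stack would take on the order of a year and a half,
    # and the hardest patches (DCP, GPU/Rust) are likely the slowest to land.
    print(f"upstreamed ~{upstreamed} patches ({fraction:.0%}), "
          f"~{rate_per_month:.0f}/month, "
          f"~{patches_after_6_16 / rate_per_month:.0f} months left at this rate")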
Makes sense, every patch they upstream is less maintenance and forward-porting work that they have to do. Keeping a downstream kernel up to date is very painful, even one that's "near mainline" as with Asahi's.
I would hope not. That would mean that no other vendor has shipped working ARM hardware support for Linux or has upstream support in the kernel. Forget the hostile nature Apple has proven to possess when consumers dare treat their hardware as if paying for it makes it their own.
Qualcomm has been beating the marketing drum on this instead of delivering. Ampere has delivered excellent hardware but does not seem interested in the desktop segment. The "greatest Linux laptop around" can not be some unmaintained relic from a hostile hardware company.
As somebody who has worked at a company that did Qualcomm devices in the past: Qualcomm just cares about money grabbing, and is not any less hostile to developers than Apple.
If you want to build a device and your only chip option is Qualcomm, I'd recommend not building a device at all.
FLOSS stacks for Qualcomm-based devices are actually a lot more feature complete than some other brands like MediaTek or Exynos. Still nowhere near any kind of "daily driver" status but at least getting somewhere, whilst others have yet to even get started.
> I would hope not. That would mean that no other vendor has shipped working ARM hardware support for Linux or has upstream support in the kernel.
Can you see any other machine coming close to a Mac in terms of hardware quality and performance? Obviously the cost is silly, but while I agree with your sentiment, it seems optimistic to hope.
Networking is going to be another major issue. Even on the Intel MacBook Pro this is still a problem. The instructions for getting it to work are so bizarre that I ended up with a network dongle with a supported chipset instead.
I'd easily pay, let's say, $100-200 a year to have Linux running on modern Apple laptops with full features. I'm sure I'm not alone. Their hardware, "our" OS would be perfect. Well, except for the notch and the lack of OLED, but reportedly that's in the works too.
Most likely, they have more mini-LEDs and/or more ability to independently control them. Of course, the localized "blooming" of mini-LEDs is a lot easier on the eyes than the all-around bloom of a conventionally backlit display.
(Better for the battery too, if you can keep most of the screen dark.)
The M3+ GPU is also very different. So while it may be true that the driver development for M1/M2 is now more or less complete as OP says, future work along the same lines will very much be needed.
This is a pretty well known thing; the M3/A17 generation GPU was a ground-up redesign that added things like dynamic caching and hardware ray tracing [1], which are highly nontrivial to simply extend an existing architecture to support. Unfortunately I can’t find where I read this, but IIRC at the time M2 came out there were expectations that M2 would have a new GPU architecture with hardware ray tracing, but this wound up being delayed to M3 because it took longer than expected to do a ground-up redesign of the GPU.
[1] https://developer.apple.com/videos/play/tech-talks/111375/
I'm curious, why aren't these people hit with a C&D from Apple?
And other great projects, like Corellium (an actual iOS VM, not that crap Apple makes), get hit hard with lawsuits etc.
(You know, it's a great project for the people who are still reverse engineering iOS for 0-days and reporting them to Apple. That's long behind me now; reporting 0-days for peanuts, yeah right :) )
If I had to guess: one seeks to reverse engineer hardware to run an open source OS. The other seeks to emulate a platform to run a proprietary, closed source OS.
If I remember correctly, Apple at the introduction of the M1 made some explicit statements about the hardware not being locked down. Something along the lines of nothing preventing Linux from running on it.
I never understood this project. Maybe I'm missing something, but the timescale is such that by the time they're done, the product isn't even being sold anymore.
At least with Panfrost it made more sense, because it's still being used.
M1 chip laptops can only be bought second hand at this point.
I believe Walmart has a deal[1] with Apple to sell[2] M1 MacBook Airs. This has been the case for a year or so, so I don't think it's old stock. They have been in stock since then, and slowly getting cheaper.
[1] https://9to5mac.com/2024/03/16/walmart-m1-macbook-air-launch...
[2] https://www.walmart.com/ip/Apple-MacBook-Air-13-3-inch-Lapto...
They’ve always been available at Best Buy, BH Photo, and other authorized partners in the US.
The Walmart deal is a total mystery. It started, seemingly, as dumping new old stock without selling it on Apple.com, but they've even updated the machine since, I think, so clearly it's an ongoing concern.
Nothing like it I know of for Apple, ever. I’d love to know the story.
Why would the product have to be available new for the project to be worth it? There are still many M1 chips out there, and this helps prolong the usefulness of those chips.
> M1 chip laptops can only be bought second hand at this point
New M1 Macbook Airs are still available at Walmart (maybe elsewhere). But even if not, who cares? People are still writing code for computers that haven't been sold since the 1980s.
Amazing work! The Panfrost driver was very impressive to me before, and now I learn she also solved the GPU driver on the Mac. Not sure Intel will be a good career path for her. :)
I'm glad she stepped away from Asahi Linux. It's absolutely great from a technical perspective, as is the progress that team has made, but talented people like her shouldn't be trying to reverse engineer software/hardware from a shitty anti-consumer company that can make the entire project work in a heartbeat by publishing documentation, in lieu of building better stuff from the ground up.
Reverse engineering requires a different mindset and a somewhat different skill set than “forward” engineering. I’ve met people who were happy to only do reverse engineering (to figure out what makes things “tick”) without building anything new.
If it were up to me, 2 years of successful reverse engineering (of a variety of projects/products) would be a requirement to be called an engineer. You learn a lot from working with real things that you can’t learn from a book (and without having to make the mistakes yourself first…)
Just to make it clear: I am not implying anything about Alyssa - just stating an observation based on my own experience.
> in lieu of building better stuff from the ground up
To be fair, even if you have the best CPU and GPU designers, it's not as if you can call up TSMC and have them do a run of your shiny new processor on their latest (or even older) process. You can't fab them at home either.
Even with proper documentation, there still would have been loads of work to get M1/M2 GPUs working on Asahi Linux. Writing GPU drivers worth a damn is about as difficult as targeting a compiler to a new CPU architecture. It would not be "in a heartbeat".
Honestly, it's kind of heartbreaking to see her leave Asahi Linux. She has done insane work building the Vulkan driver from scratch. I wish her well working at Intel. If I ever buy an Intel GPU, I can rest much easier knowing it will work well on Linux. If she is working on the Linux driver stack, that is.
There isn't really anything left for her to do - everything missing (including work on the newer graphics chips) can be somewhat easily done by less talented people, building on her work.
She did the challenging stuff she cares about. One aspect of nerd brain often is that you can hyperfocus on challenging stuff, but can't get the motivation to work on stuff you don't care about - and even what would be a 20 minute task can end up taking days because of that. It's great that she has the self awareness to set goals, and step away once they're done.
I didn't have that at that age - and still sometimes struggle. I was lucky enough that my employer back then recognized my issues and paired other people with me to do the stuff I was not interested in, and I now usually manage to offload those tasks onto co-workers by myself.
This is of course great as long as you can find enough "challenging" work to perform, but any successful project is going to involve a whole lot of seemingly "boring" work. A big part of true maturity and professionalism is being able to find the interesting challenge even in these more run-of-the-mill tasks and successfully engage with them.
(Mind you, I'm not talking about a matter of inborn temperament or character, much less a moral flaw! Rather, finding the compelling challenge even in "boring" tasks is a valuable skill and situational tactic that anyone should explicitly learn about and aim to acquire as part of becoming a mature professional, not a matter of morality or somehow being dismissed as "lazy"!)
Because I have it, untreated. And I couldn't even finish university because of it.
I'm unable to do certain things, like, at all; I'm nearly physically ill when doing them. Hard to explain to someone without these problems :)
Luckily enough, it's not that important here / Idc about money, career etc.
Ideally, society would be aware of such people and actually use their potential. AR struck it lucky and so did a few others (cough Richard Stallman cough), but most don't and end up burned out by rigid megacorp structures and processes that don't respect that people, even those one might call "neurotypical", aren't cogs in a machine.
I've said it before and I will keep saying it again: the financialization of everything and the utter dominance of braindead, long-since disproven MBA ideology is going to seriously impede our societies in the next decades.
M3, M4 and soon M5 are ready to be cracked open :). From what I understand, they are actually somewhat different hardware-wise. So it's really not like there wouldn't be a continuation of this work. But of course it's natural to want something else after years of working on the project.
Take into account that she's focusing on the 3D stack, not the overall hardware. Even with hardware differences there's a good chance it's not different enough to make it an interesting new challenge.
Given the features that have been advertised for the M3+ graphics and compute stack, there's rather a good chance that it is different enough to create big, new challenges for third-party support.
Yes, I meant the GPU specifically, not the hardware in general. An example is the support for hardware ray tracing in M3 and beyond. In a now-deleted French fediverse post, Alyssa indicated that M3 has a new architecture.
You can already do this work on M1/M2 using Asahi. A compute server doesn't need fully working peripherals and external displays.
It is accepting a new challenge.
How tf does she juggle and manage to do all this? I can barely do one of the above properly.
Every person is different, of course; there might be this one brilliant engineer somewhere, forced to manage against his will.
The great thing is, you can!
The MacBook Pro display is one of the best laptop displays.
Any sources for that? I'd be quite surprised if Apple had radically altered the architecture.
Well done.
With all of Apple’s secure boot stuff, they had more than enough ways to totally squash running alternate OSes on the machines like a bug.
Instead they seem to have gone out of their way in a few places to make it not only possible but secure.
They’ll NEVER say anything publicly, or give documentation, but they’re leaving doors open on purpose.
But 8GB of RAM... that's unfortunately completely unusable for most developers. (Panfrost drivers you can at least use on RPi-like devices.)
Maybe in another 5 years it'll work on the M3/M4 and I'll revisit this. Good to know the devices are still being built so long after release.
Maybe it's just due to a complete lack of attention, but I think M3/M4 support is extremely minimal at this point, which is not a great sign.
I wouldn't be so quick to judge someone for ADHD.
Trolling will get you banned here, so please don't.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...