Interesting to see this when the current top post on HN is someone worrying about Bun after it was acquired by Anthropic. The top comment there reads: “Anthropic does experiments on their own codebase, the Bun team is not gonna do the same vibe coding experiments”.
Yet here we are, what looks like a massive undertaking for vibe coding.
Time will tell how this will turn out. Would be nice if the Bun maintainers could give some clarification about what they’re doing here, and why they’re doing this.
They recently tried to upstream an improvement to zig, but were prevented from doing so because zig has a hard and fast "no AI code" rule. Whether you think this response is trying to put pressure on zig or whether they're just moving for practical reasons is up to you.
Makes me wonder why zig announced the strict LLM rule recently. I'm afraid one reason could be that zig doesn't want to accept code from the bun fork in the first place (because of LLM usage, deviation, and other reasons).
One non-obvious reason is that an important aspect of their community is to shepherd new contributors [1]. LLMs crushing everything would reduce that. More obvious is all the toil for maintainers dealing with LLM PRs (it's broadly an issue). The Zig maintainers prefer to put their energy into improving people and fostering those relationships.
It's a combination of pragmatism (not wanting to wade through slop, not wanting to shove out newbie developers) and politics (usual contemporary techie progressive stuff that's now oddly anti-technology).
I was hoping for bash, because why not. It's the AI that has to write and maintain it anyway, and Anthropic employees presumably aren't limited by the 5-hour/7-day usage limits.
Rust is legit one of the best languages to "vibe code" in.
The emitted code has a lower defect rate since the language incorporates strong types and built-in error handling. Other pros include native code and portability; the downside is the compile time.
This could be a subjective feeling with no real data to back it up.
People say the same about Go: that its type system and limited feature set make it the most AI-friendly language. But there too, it seems like a hunch rather than a proven fact.
The thing is that this argument doesn't work with Go, because its type system (and the whole language, really) is much less expressive and the compiler gives a lot less feedback to the LLM. So it tends to have to write more unit tests and do more cycles of testing (and spend more tokens) to get it right.
The compile-time downside is somewhat offset once you're using agents (and especially parallel agents) anyway. Since all of your edits cost a round-trip API call to a third-party server, you can accept a slightly slower compile step.
Wow. That xkcd was written in 2007, and part of the dialog is "didn't that [meme] die like five years ago?" Which means All Your Base, as a meme, was already getting somewhat stale by around 2002. It's hard to believe it's been that long.
> what looks like a massive undertaking for vibe coding
fwiw, I suspect it's less of an undertaking than you may think. I've been playing with AI to rewrite Postgres in Rust[0] over the past couple of weeks, and I found the AI to be exceptional at doing rewrites. Having an existing codebase you can reference prevents a lot of the problems you have with vibe coding: you have an existing architecture that works well and a test suite that you can test against.
Over the course of a month I've gone from nothing to passing over 95% of the Postgres test suite. Given Jarred built Bun, I bet he'll be able to go much faster
> I suspect it's less of an undertaking than you may think... having an existing codebase you can reference prevents a lot of the problems you have with vibecoding.
Yeah, it's a distinction worth making, and the language for making it kind of sucks. "Vibe coding" means "AI does the whole thing" or "I use tab autocomplete," depending on who you ask. It's not a very useful term anymore; we need better ones.
My benchmark is basically, "are you letting the AI drive."
In this case, an AI appears to have written the migration guide...
It was and is a perfectly good term, but people started using it without regard for its definition. I don't know why people wouldn't misuse a "better" term the same way.
I do not know if there's any overlap between these teams, but it seems like Anthropic itself is fairly invested in the Rust ecosystem.
They recently proposed some of their internal tools to be the official Rust implementation[0] of Connect RPC[1]. As a protobuf-based library set, this includes a new Rust-based protobuf compiler, Buffa[2].
Zig is a moving target. 0.15 -> 0.16 includes some massive structural changes concerning IO and async/threading.
Claude has absolutely no idea what it's doing with bleeding edge zig unless you feed it source and guide it closely (in which case it's useful for focused work) - I'm building a game engine & tcp/udp servers with it and it requires a hands-on approach and actually understanding what's being built.
I imagine these are not really concerns with rust at this point.
In my ideal world the team behind bun would be putting in the work to keep up with modern zig, but it's starting to look like they are running mostly on vibes in which case rust might be a better choice.
> it requires a hands-on approach and actually understanding what's being built.
I think this is true regardless of what language you’re using.
I’ve built a lot in Zig and there’s no difference between vibing stuff in it versus TypeScript/React. Claude can “one-shot” them both, and will mimic existing code or grep the standard library to figure everything out.
Which isn't particularly difficult - the language docs and std source come with the installation, so all you need to do is tell Claude where those directories are in your skill/plugin/CLAUDE.md.
> and guide it closely (in which case it's useful for focused work)
It does struggle sometimes with writing code that compiles and uses the APIs correctly. My approach to that so far has been to write test blocks describing the desired interface + semantics, and asking Claude to (`zig test` -> fix errors) in a loop until all the tests pass.
You're already at a disadvantage having to stuff the context and spend extra tokens coercing the model in the correct direction compared to it already knowing what to do (rust, ts, go, etc.)
Here, I just did a quick test with claude.
1. "make a simple tcp echo server that uses rust"
compiles and runs - took a few seconds to generate.
2. "make a simple tcp echo server that uses zig"
result: compile error, took literal minutes of spinning and thinking to generate
response: "ziglang.org isn't in the allowed domains. Let me check if there's another way, or just verify the code compiles conceptually and present it clean."
/opt/homebrew/Cellar/zig/0.15.2/lib/zig/std/Io/Writer.zig:1200:9: error: ambiguous format string; specify {f} to call format method, or {any} to skip it
@compileError("ambiguous format string; specify {f} to call format method, or {any} to skip it");
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
3. "make a simple tcp echo server that uses zig 0.16"
result: compile error:
zig build-exe main.zig
main.zig:30:21: error: no field named 'io' in struct 'process.Init.Minimal'
const io = init.io;
^~
4. "make a simple tcp echo server that uses zig 0.15"
result: compile error
zig build-exe main.zig
/nix/store/as1zlvrrwwh69ii56xg6yd7f6xyjx8mv-zig-0.15.2/lib/std/Io/Writer.zig:1200:9: error: ambiguous format string; specify {f} to call format method, or {any} to skip it
@compileError("ambiguous format string; specify {f} to call format method, or {any} to skip it");
Rust took seconds and just works. Zig examples took minutes and don't work out of the box. The DX & velocity isn't even close.
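For what it's worth, the Rust version of that prompt doesn't need much to "just work". Here's a sketch of the kind of echo server it produces (not Claude's verbatim output; the port number and buffer size are arbitrary):

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// Echo bytes back to the client until it hangs up.
fn handle_client(mut stream: TcpStream) -> std::io::Result<()> {
    let mut buf = [0u8; 1024];
    loop {
        let n = stream.read(&mut buf)?;
        if n == 0 {
            return Ok(()); // client closed the connection
        }
        stream.write_all(&buf[..n])?;
    }
}

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:7878")?;
    println!("echoing on 127.0.0.1:7878");
    for stream in listener.incoming() {
        let stream = stream?;
        // One thread per connection; fine for a toy, no async runtime needed.
        thread::spawn(move || {
            let _ = handle_client(stream);
        });
    }
    Ok(())
}
```

Run it and `nc 127.0.0.1 7878` echoes whatever you type, using nothing but the standard library.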
i mean, if zig is doing its best (inadvertently) at shooing off slop jockeys, then i already have more confidence that:
1. the language and stdlib are written by people who know what they're doing
2. packages in the ecosystem, at the barest level, are written by those who didn't leave after a few compile errors they couldn't reason about
I wouldn't call any port "prudent". In general, taking mature software and doing any major rewrite is one of the riskiest things you can do. It is a large-scale attempt to fix what isn't broken.
Sometimes it is worth it, but it may also kill projects. A risky move. And AI doesn't help its cause. AI can save a lot of time when making ports, it is one of the things it does best, but it doesn't protect from regressions.
I am not using Bun in production, but if I was, I would consider it a risk. Not because of Rust vs Zig, but for changing things that work.
There are something like 1,713 open PRs on the Bun repo. I'm assuming all are from Claude or robobun? I guess this gives us an insight into what the claude-code workflow looks like. Crazy times.
> I expect OSS to go the opposite direction: no human contribution allowed.
How is it an incorrect interpretation? Jarred is indeed pitching/suggesting/predicting that human contribution will not be allowed in the near future, i.e. banned.
Anthropic makes claude, claude can write Rust like a champ and struggles at Zig. It's a straightforward "training data" argument.
I think there are even longer term plays that Anthropic should be looking at, in this space, but it seems like they've decided rust is the right thing, so fair play. I would be (am!) thinking about making an LLM optimized high level language that you can generate / train on intensively because you control the language spec.
Claude doesn’t write Rust like a champ. It’s still miles ahead at JS and Python compared to Rust. It can do macros and single-file optimizations, but it’s gotten really stuck in type hell and tried to `dyn` everything on multiple occasions for me.
Claude struggling at Rust: not getting types correct, using the wrong abstractions, not implementing things correctly
Claude struggling at Zig: the above + memory safety issues if you run “fast” mode.
It is generally true that Rust code tends to be written in a way that the compiler catches the issue at compile time. The same is not as true for Zig, Python or JS
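A toy illustration of that difference (the function here is made up for the example): the missing-value case is part of the type, so the compiler rejects generated code that forgets it, where a dynamic language would defer the failure to runtime.

```rust
// Rust forces the "value might be absent" case into the type, so it must be
// handled before the program compiles; in JS or Python the equivalent mistake
// only surfaces as a runtime error.
fn first_word(s: &str) -> Option<&str> {
    s.split_whitespace().next()
}

fn main() {
    // Using the Option as a plain &str is a compile error, so generated code
    // has to spell out the None branch explicitly:
    match first_word("   ") {
        Some(w) => println!("first word: {w}"),
        None => println!("no words"),
    }
}
```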
I'm reminded of the old joke "how to shoot yourself in the foot in 25 different languages". The first one was "C - you shoot yourself in the foot." Zig remains very close to that philosophy.
So the difference is not in writing new stuff but in maintaining the existing codebase. Rust's rigidity makes it potentially harder to break stuff compared to Zig's general flexibility. As a project grows and matures, different types of contributors naturally come in and it's unreasonable to expect everyone to learn about historical footguns that may have accumulated.
100%.
For many people, Bun is the only reason they've even heard of Zig. I'm not in a position to comment intelligently on comparative language features per se, but when it comes to mindshare and community size, Rust is a clear winner.
I would expect all LLMs are going to be better at Rust than Zig - a strong, thorough compiler will simply prevent more mistakes, and the benefits of a "simple" language decrease the larger the code base gets. The more abstractions exist, the less valuable "no hidden control flow" or "no hidden allocations" from the standard library get, and that's before you add the mother of all abstractions of vibe coding.
They do work well. But I still see the occasional type related issue or bug from refactoring that claude will introduce into javascript and python code. It seems to be happening less and less frequently as the models get better. But, the rust compiler catches real bugs in LLM code. I consider that a win.
Has anyone made any cross language benchmarks for LLMs? I wonder if rust's conceptual complexity makes it harder for LLMs to write? If all you care about is working software, which language is best for LLMs? Python, because there's more example code? Go or Java, because they're simpler languages? Ruby because its terse? Rust because of the compiler? I'd love to see a comparison!
But why should they? This just seems like the groundwork for an initial refactor and moving from one language to another. They haven't actually committed to switching from Zig to Rust yet. I mean, I get if you are an investor and you want to see if they are using their time effectively, but why would it matter to anyone else?
They’re not required to do so, but like I said, it would be nice, because it removes a lot of speculation. And development is in the open, so people notice what they’re doing.
Lots of people, me included, have heavily invested their time and expertise into Bun, using it as a daily driver to bundle production code, or even using it in production as a JS/TS runtime. Of course we are interested in Bun staying a useful tool. The Anthropic acquisition was worrying enough on its own.
But there isn't currently any change to someone's expertise in Bun, just to development. Why would they have to walk you through a daily stand-up about their development process?
The definition is at https://x.com/karpathy/status/1886192184808149383 and no, that does not match what is in the branch. Systematically migrating a code base using an LLM does not match the definition of vibe coding.
> I’m seeing people apply the term “vibe coding” to all forms of code written with the assistance of AI. I think that both dilutes the term and gives a false impression of what’s possible with responsible AI-assisted programming.
All language is "coined terms". The point is that if you dilute the definition of a term, you make the term useless. Evolution of a term doesn't happen automatically; correcting terms like these pushes the evolution in a more useful direction. Also, the evolution of language is not a magic spell that automatically forgives people for language mistakes.
I think the definition of vibe coding is a bit fluid; in this case I just meant it as "code fully generated by AI, possibly not fully reviewed by human eyes". I agree that this is definitely not "coding based purely off vibes", and the approach looks legit.
It depends on what you mean by "vibe coding". Is AI coding based on an existing implementation vibe coding? What about only from a natural-language spec? How does manual reviewing affect whether or not it's vibe coding?
In practice all use of AI rapidly becomes vibe coding. Even if someone says they're going to carefully manually review everything that's generated, within a couple of days they get bored and just click approve.
This is just a matter of priorities - I use LLMs to write code every day and I have never put a single line of code up for review that I didn’t read and understand.
Porting from one typed language to another seems like a perfect use for LLMs. I can see the appeal of both languages and why to consider such an action (e.g., rust is a mainstream PL vs zig's cult status (no slight intended)).
I think the big difficulty here is that Rust's ownership model in particular tends to require certain kinds of control flow to avoid a bunch of weird churning/copying, which makes it not as straightforward of a port target from other imperative languages.
Like maybe you get the LLM to try _really hard_ to churn through everything, but this feels like a big case of "perils of the lack of laziness".
Of course if you have a good idea for how to deal with allocations etc "idiomatically" already maybe that works out well. And to the credit of the port guide writer bun seems to have its explicit allocations that are already mapping pretty well to Rust.
This is all wild conjecture, but I'd assume that teaching the LLM to do that mapping is an achievable goal, and then it gets close to automatic -- effectively slurp the source AST into a Rust AST and render.
My only experience with ports so far is Python to Go, and it's been near flawless (just enough stupid shit to make me feel justified to be in the loop).
It really isn't if you don't have the right abstractions.
Especially for memory management, the right and wrong abstractions in Rust can mean a factor of 5 or 10 difference in difficulty. With the right memory management abstraction your code can be a straight-line port (or even cleaner!); with the wrong one you're going to spend a lot of tokens watching a machine spin around in circles trying to untie itself.
GC'd languages don't have this problem, though obviously you can still generate stupid amount of pain for yourself by doing something wrong
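As a concrete example of a "right abstraction": pointer-heavy structures from C or Zig often port cleanly if the pointers become indices into a backing Vec, which sidesteps most borrow-checker fights. A hypothetical sketch (the names are illustrative, not from the Bun codebase):

```rust
// A linked list ported index-style: `next` is an index into `Arena::nodes`
// instead of a raw pointer, so there are no lifetimes or aliasing to fight.
struct Node {
    value: i32,
    next: Option<usize>, // index of the next node, if any
}

struct Arena {
    nodes: Vec<Node>,
}

impl Arena {
    // Append a node and return its index (the port's stand-in for a pointer).
    fn push(&mut self, value: i32, next: Option<usize>) -> usize {
        self.nodes.push(Node { value, next });
        self.nodes.len() - 1
    }

    // Walk the chain starting at `cur`, summing values.
    fn sum_from(&self, mut cur: Option<usize>) -> i32 {
        let mut total = 0;
        while let Some(i) = cur {
            total += self.nodes[i].value;
            cur = self.nodes[i].next;
        }
        total
    }
}

fn main() {
    let mut arena = Arena { nodes: Vec::new() };
    let tail = arena.push(3, None);
    let head = arena.push(2, Some(tail));
    println!("{}", arena.sum_from(Some(head))); // 5
}
```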
I'm porting a large-ish Delphi application to C#. It's been pretty hands-off except for converting to async and some language-capability mismatches.
Honestly, this kind of thing seems to work quite well with vibe coding. If I remember correctly, the Ladybird JS engine was "vibe-ported" to Rust as well, and it passed 100% of the original test suite, in addition to new Rust tests.
Interesting how times have changed. Back in 2015, the entire Go runtime (already a mature codebase) was rewritten from C to Go semi-automatically: one of the maintainers wrote a C-to-Go conversion tool (for a subset of C they used) so that it compiled and produced identical output, and then the resulting code was manually refactored to make the Go code more idiomatic and optimized. And now you can just ask a language model.
Linked commit is probably not the most convincing for this tagline. Here's a branch[0] of Claude mass rewriting Zig code into Rust which is currently at 773,950 additions and 151 deletions:
The problem with vibe coded re-writes is that you basically sign off on understanding the generated codebase at that point. Any historical knowledge of the codebase is gone.
the rust they've written (so far) is highly unidiomatic (and with a ton of unsafe). I can't speak to the zig part, but it seems plausible to me it is line-by-line, horrendous rust.
Whether or not they can clean it up is an interesting question.
It makes the git history a bit more confusing to follow if you want to see old changes, but I'm sure a simple wrapper to check for the zig equivalent files as well wouldn't be very difficult.
I want zig to succeed but given that zig is not yet 1.x I'd imagine a large code base like bun would have difficulties addressing major breaking changes. Also given the fact that bun is using a fork of zig https://x.com/bunjavascript/status/2048427636414923250?s=20
So I can't tell if the linked commit is an actual attempt or just an experiment but it did always strike me as odd to make a JS runtime in Zig when my impression was there were a lot of work-stopping compiler bugs at the time.
I wonder if a successful, albeit slower, approach would be to walk the git commit history in lockstep, applying the behavioral intent behind each commit. If they did this, I would be interested in knowing if they were able to skip certain bug fix commits because the Rust implementation sidestepped the problem.
this is an interesting idea and i might try it with something smaller. there are more than 15,000 commits to bun, so you’d have to have some sort of way to operate on groups of commits in one prompt to get that done without thousands and thousands of api requests
If nothing else, it'll be good marketing material targeted at non-technical enterprise executives, who will pressure their engineering teams in meetings: look, people are porting such complicated things from one language to a totally different one, so why are we not using AI effectively?!
The only Bun shipped product I've used in anger is OpenCode and I regularly run into segfaults on it. I doubt this is the reason for migration but every time it happens, it reminds me the real cost of unsafe code. That being said, Zig is an absolute pleasure to write and I can't wait until it has a real library ecosystem, Rust's greatest boon.
That seems totally reasonable but I wonder if there was some head butting in non-public channels given Bun is one of the biggest players in Zig and planned to push through a change like that on their own.
Zig is a moving target that has breaking changes in every release (which is fine as they are sub-1.0). But that means that AI tools have been trained on outdated syntax/etc. Zig isn't that common, so there is even less training data to begin with.
Rust on the other hand is pretty established by now and has less breaking changes. It also has more compile-time safety-guarantees that makes vibe-coding a bit more confident.
On top of that, Zig has rejected their upstream contributions. So they'd have to maintain their own compiler fork in the long run, which is just extra technical debt.
Most of my vibe coding is in zig, and it has been my experience that Claude and Codex both keep up with zig changes just fine. Every now and then I catch them writing outdated code that they burn some tokens on, but my experience says your local codebase's idioms will influence what gets generated enough to stop this from being a problem.
Probably an experiment due to Bun's PRs to Zig being rejected (Zig does not allow AI use). If Rust works well enough, and the alternative is maintaining a fork of Zig, I'd guess they'd go with Rust.
It seems there was an issue where the image API ignored the ICC profile (now fixed).
Any developer with experience implementing image formats would almost certainly avoid this mistake. This is a problem that cannot be solved with vibe coding. In this situation, the user is merely a guinea pig for bug fixes.
I'll be very interested in how this AI port turns out. I am involved in a number of active projects that are being held back by their language/framework, but where a rewrite would be too big a project to undertake using only human power.
I've had more success vibe coding Rust than I have in more dynamic languages. I suspect the strictness of the Rust compiler forces the AI agent to produce better code. Not sure. It could be just that I am less familiar with Rust so it feels like it's doing a better job.
Bun is the largest project written in zig. And it isn't close. Bun is bigger than zig itself. Seems like zig isn't mature enough to handle Bun's needs, so I don't blame them at all for looking for off ramps. Only time will tell if rigidity from the zig team is worth the cost of losing Bun. It might be.
When I first heard that bun was written in zig, I thought that was an odd choice for such a large project, mostly because the language is "unstable" and is still making significant breaking changes.
I would guess dealing with breaking changes is a big motivation for this.
Interesting. When I thought of Zig, I thought of Bun. In my mind it was the flagship application for that language. Is there another? I wonder how the Zig team feels about this. To me it seems like Rust has definitively won now.
The answer is in the next sentence: "Bun owns its event loop and syscalls." They clearly want to manage their use of threads explicitly, which is not _unusual_ for systems programming but probably less common. Note that `rayon` is different from most of these in that it has nothing to do with async Rust - it's a tool for spreading computation over a thread pool, very popular in non-async projects, but it would also go against their goals here.
It's not really shunned - it's the standard solution for async in Rust - but it's not the right solution for every project, especially if you have specific requirements for how your project's computation should be scheduled. I would guess that Bun is one of those projects, especially as it needs to be able to schedule JS async work itself.
tokio is great and it's pretty performant, but you pay an allocation for every future unless you do some complex organization of your futures.
Source: I worked on Deno, competed directly with Bun on HTTP performance (and won on some metrics).
Edit: and of course I typed future instead of task (aka "spawned future"). Thanks, child commenters below. Much of Deno was built on spawning futures that mapped to promises and doing it as fast as possible. I spent ages writing a future arena to optimize this stuff..
You only allocate on box futures, which are much more rare than naked futures - generally only used where object safety (essentially dyn support) is required. Even then some workarounds exist.
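To make the allocation point concrete: a plain `async fn` future is an inline value with no heap allocation; you only pay when you box it for type erasure (which is also roughly what a spawned task pays in a runtime like tokio). A minimal sketch, using a toy busy-poll executor instead of a real runtime (assumes Rust 1.85+ for `Waker::noop`; the function names are made up):

```rust
use std::future::Future;
use std::pin::{pin, Pin};
use std::task::{Context, Poll, Waker};

// A toy busy-poll executor: just enough to drive simple futures without tokio.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let mut cx = Context::from_waker(Waker::noop());
    loop {
        if let Poll::Ready(v) = fut.as_mut().poll(&mut cx) {
            return v;
        }
    }
}

async fn add(a: u32, b: u32) -> u32 {
    a + b
}

fn main() {
    // Inline future: the whole state machine lives on the stack, no allocation.
    let inline_result = block_on(add(1, 2));

    // Boxed (type-erased) future: needed for `dyn Future`, costs one heap
    // allocation up front.
    let boxed: Pin<Box<dyn Future<Output = u32>>> = Box::pin(add(3, 4));
    let boxed_result = block_on(boxed);

    println!("{inline_result} {boxed_result}"); // 3 7
}
```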
It's an async runtime. The whole async-await flow removes a little bit of scheduling control and adds some forced memory management in order to give you some nicer code in an application case, but if you're trying to build a runtime yourself I think you'd much rather retain control in this case. It's just hard to reason about.
You much rather have this runtime you're building manage task scheduling and allocation and all that. It's the most natural design choice to make.
Tokio is a general purpose async runtime. Much the same could probably be said for async-std (except IIRC they do have a barebones reactor for you to build your own on). In general, a general-purpose async runtime will do worse for highly specific tasks than a purpose-built one (especially e.g. NUMA).
I think avoiding async entirely might be a mistake, and I'm not convinced anything better than a general-purpose async runtime exists for a JS runtime (it is itself general purpose, after all).
Avoiding std::fs is fucking bizarre to me: it's completely sync and is a really lightweight abstraction over syscalls.
my guess is they want to do I/O as part of their event loop explicitly, and blocking a thread in a syscall waiting for an IOP (a la std::fs) isn't the vibe.
You shouldn't have to pull in big complex dependencies to do what should be primitive things. Zig is putting a strong and thought-out effort into getting async & parallelism "right" inside the stdlib. I'm honestly not up to speed with where rust is at with it at the moment, but last time I checked it was a bit of a mess.
Async is much harder to work with than sync+threading is. And while threads have more overhead in theory, in practice almost nobody is writing applications at such a scale where that overhead actually matters. So I don't blame them for eschewing async, there's likely no benefit for the project in it.
April 26th - Bun announces they used AI to fork Zig so they could make an optimization for a 4x improvement
April 27th - Zig contributor mlugg clarifies why the specific optimizations Bun did were ill advised and wouldn't have been accepted in Zig, regardless of AI use [1]
May 4 - Bun is looking into Rust as an alternative.
This, to me, seems like total whiplash. Has anyone at Bun made a statement on why they're making such dramatic changes? It seems like the lesson to internalize from mlugg is not "switch to Rust"
People are asking why they would switch from zig to rust. I wonder the opposite: why would anyone use zig?
It's not memory safe. Rust is. I'm curious why anyone would use it for a systems program, especially when AI can write it in rust instead with as much effort. Arguably it would do better with rust since LLMs are trained on more rust.
This feels more like a reaction to Zig's anti-LLM policy than anything. Anthropic would probably like to contribute something back to Zig at some point, but I doubt anyone would ever believe their PRs were not written by Claude.
Exactly, this is a direct response to Zig refusing to accept pull requests from Bun (and Anthropic). That situation forced Bun to maintain a fork of Zig, and it makes sense in the long term that they'd rather port their entire project to Rust.
I've really enjoyed Bun the past year or so, but the acquisition by Anthropic, Bun's codebase and documentation increasingly becoming AI slop, and this impulsive complete rewrite - all of it has ruined it for me and I'm actively moving off of Bun. I don't feel comfortable relying on it any longer.
It's probably a bit of both.
[1] https://kristoff.it/blog/contributor-poker-and-ai/
https://xkcd.com/286/
[0] https://github.com/malisper/pgrust
That's because it's not vibe coding - stingraycharles doesn't seem to understand what vibe coding is. Vibe coding was defined here https://x.com/karpathy/status/1886192184808149383
> There's a new kind of coding I call “vibe coding”, where you fully give in to the vibes, embrace exponentials, and forget that the code even exists.
This is very far from Anthropic's migration plans.
with superpowers, i see a lot of specs -> impl plan -> execute plan
I believe we now have it all, but we fail at choosing.
[0]: https://github.com/orgs/connectrpc/discussions/7#discussionc...
[1]: https://connectrpc.com/
[2]: https://github.com/anthropics/buffa
Claude has absolutely no idea what it's doing with bleeding edge zig unless you feed it source and guide it closely (in which case it's useful for focused work) - I'm building a game engine & tcp/udp servers with it and it requires a hands-on approach and actually understanding what's being built.
I imagine these are not really concerns with rust at this point.
In my ideal world the team behind bun would be putting in the work to keep up with modern zig, but it's starting to look like they are running mostly on vibes in which case rust might be a better choice.
I think this is true regardless of what language you’re using.
I’ve built a lot in Zig and there’s no difference between vibing stuff in it versus TypeScript/React. Claude can “one-shot” them both, and will mimic existing code or grep the standard library to figure everything out.
Which isn't particularly difficult - the language docs and std source come with the installation, so all you need to do is tell Claude where those directories are in your skill/plugin/CLAUDE.md.
> and guide it closely (in which case it's useful for focused work)
It does struggle sometimes with writing code that compiles and uses the APIs correctly. My approach to that so far has been to write test blocks describing the desired interface + semantics, and asking Claude to (`zig test` -> fix errors) in a loop until all the tests pass.
Here, I just did a quick test with claude.
1. "make a simple tcp echo server that uses rust"
compiles and runs - took a few seconds to generate.
2. "make a simple tcp echo server that uses zig"
result: compile error, took literal minutes of spinning and thinking to generate
response: "ziglang.org isn't in the allowed domains. Let me check if there's another way, or just verify the code compiles conceptually and present it clean."
/opt/homebrew/Cellar/zig/0.15.2/lib/zig/std/Io/Writer.zig:1200:9: error: ambiguous format string; specify {f} to call format method, or {any} to skip it
        @compileError("ambiguous format string; specify {f} to call format method, or {any} to skip it");
        ^~~~~~~~~~~~~
3. "make a simple tcp echo server that uses zig 0.16"
result: compile error:
zig build-exe main.zig
main.zig:30:21: error: no field named 'io' in struct 'process.Init.Minimal'
    const io = init.io;
                    ^~
4. "make a simple tcp echo server that uses zig 0.15"
result: compile error
zig build-exe main.zig
/nix/store/as1zlvrrwwh69ii56xg6yd7f6xyjx8mv-zig-0.15.2/lib/std/Io/Writer.zig:1200:9: error: ambiguous format string; specify {f} to call format method, or {any} to skip it
        @compileError("ambiguous format string; specify {f} to call format method, or {any} to skip it");
Rust took seconds and just works. The Zig examples took minutes and don't work out of the box. The DX and velocity aren't even close.
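For reference, a minimal blocking Rust TCP echo server along these lines, a sketch of what such a one-shot typically looks like, not Claude's actual output (port 7878 is an arbitrary choice):

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// Echo everything back to the client until it disconnects or errors.
fn handle_client(mut stream: TcpStream) {
    let mut buf = [0u8; 1024];
    loop {
        match stream.read(&mut buf) {
            Ok(0) => break, // peer closed the connection
            Ok(n) => {
                if stream.write_all(&buf[..n]).is_err() {
                    break;
                }
            }
            Err(_) => break,
        }
    }
}

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:7878")?;
    // One thread per connection: the simplest design that works.
    for stream in listener.incoming().flatten() {
        thread::spawn(move || handle_client(stream));
    }
    Ok(())
}
```

Nothing here is outside `std`, which is part of why the one-shot succeeds: there are no ecosystem APIs to hallucinate.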
1. the language and stdlib are written by people who know what they're doing
2. packages in the ecosystem, at the barest level, are written by those who didn't leave after a few compile errors they couldn't reason about
Zig is a great language and I want to see it succeed, but this is a prudent move for Bun.
Sometimes it is worth it, but it may also kill projects. A risky move. And AI doesn't help its cause. AI can save a lot of time when making ports, it is one of the things it does best, but it doesn't protect from regressions.
I am not using Bun in production, but if I was, I would consider it a risk. Not because of Rust vs Zig, but for changing things that work.
How is it an incorrect interpretation? Jared is indeed pitching/suggesting/predicting that human contribution will not be allowed in the near future, i.e. banned.
I think there are even longer-term plays that Anthropic should be looking at in this space, but it seems like they've decided Rust is the right thing, so fair play. I would be (am!) thinking about making an LLM-optimized high-level language that you can generate / train on intensively because you control the language spec.
Claude struggling at Zig: the above + memory safety issues if you run “fast” mode.
It is generally true that Rust code tends to be written in a way that lets the compiler catch issues at compile time. The same is not as true for Zig, Python, or JS.
So the difference is not in writing new stuff but in maintaining the existing codebase. Rust's rigidity makes it potentially harder to break stuff compared to Zig's general flexibility. As a project grows and matures, different types of contributors naturally come in and it's unreasonable to expect everyone to learn about historical footguns that may have accumulated.
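A tiny illustration of the kind of error Rust surfaces at build time (hypothetical `parse_port` helper, not from Bun's codebase):

```rust
// An exhaustive match: the compiler rejects the code if any case of the
// Result is left unhandled, so a forgotten error path is a compile
// error rather than a runtime surprise.
fn parse_port(s: &str) -> Option<u16> {
    match s.parse::<u16>() {
        Ok(p) if p > 0 => Some(p),
        Ok(_) => None,  // port 0 rejected explicitly
        Err(_) => None, // non-numeric or out-of-range input
    }
}
```

Deleting the `Err(_)` arm makes this fail to compile, which is precisely the kind of feedback loop that keeps an LLM (or a new contributor) honest during maintenance.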
Something JS-adjacent could certainly be better known than an obscure language, but are that many people using drop-in Node replacements?
But I can’t reconcile the reasoning about “strong, thorough compiler” with the fact that LLMs are also fantastic at Ruby.
They also write really great POSIX shell (including very sophisticated scripts) and Python.
Something more subtle is going on.
Has anyone made any cross-language benchmarks for LLMs? I wonder if Rust's conceptual complexity makes it harder for LLMs to write? If all you care about is working software, which language is best for LLMs? Python, because there's more example code? Go or Java, because they're simpler languages? Ruby, because it's terse? Rust, because of the compiler? I'd love to see a comparison!
Probably similar for Python and Bash, but with a lot more stable training data given their popularity and stability.
It doesn’t look like that at all. Do you think that all use of AI is vibe coding?
https://github.com/oven-sh/bun/compare/claude/phase-a-port
This single commit is 65k lines of additions
https://github.com/oven-sh/bun/commit/ffa6ce211a0267161ae48b...
There's a decent article by Simon Willison that talks about this: https://simonwillison.net/2025/Mar/19/vibe-coding/
> I’m seeing people apply the term “vibe coding” to all forms of code written with the assistance of AI. I think that both dilutes the term and gives a false impression of what’s possible with responsible AI-assisted programming.
"+27,939 lines changed: 27,939 additions & 0 deletions" of new Rust code
This is obviously very different from that, but the way the commit looks doesn't make it so.
Like maybe you get the LLM to try _really hard_ to churn through everything, but this feels like a big case of "perils of the lack of laziness".
Of course if you have a good idea for how to deal with allocations etc "idiomatically" already maybe that works out well. And to the credit of the port guide writer bun seems to have its explicit allocations that are already mapping pretty well to Rust.
My only experience with ports so far is Python to Go, and it's been near flawless (just enough stupid shit to make me feel justified to be in the loop).
Especially for memory management the right and wrong abstractions in Rust can lead to a factor of 5 or 10 extra amount of difficulty. The right memory management abstraction and your code can be a straight line port (or even cleaner!), the wrong one and you're going to just be spending a lot of tokens to have a machine spin around in circles trying to untie itself
GC'd languages don't have this problem, though obviously you can still generate stupid amount of pain for yourself by doing something wrong
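To make the "right abstraction" point concrete, one pattern that turns pointer-heavy code into a straight-line Rust port is an index-based arena: nodes refer to each other by index rather than by reference, so the borrow checker never sees a cycle. A hypothetical sketch:

```rust
// Index-based arena: sidesteps the borrow-checker fights that a
// direct pointer-for-pointer port would trigger. Links are plain
// usize indices into one owned Vec.
struct Node {
    value: i32,
    next: Option<usize>,
}

struct Arena {
    nodes: Vec<Node>,
}

impl Arena {
    // Append a node and return its index (its "pointer").
    fn push(&mut self, value: i32, next: Option<usize>) -> usize {
        self.nodes.push(Node { value, next });
        self.nodes.len() - 1
    }

    // Walk the linked chain starting at `idx`, summing values.
    fn sum_from(&self, mut idx: Option<usize>) -> i32 {
        let mut total = 0;
        while let Some(i) = idx {
            total += self.nodes[i].value;
            idx = self.nodes[i].next;
        }
        total
    }
}
```

Picking the wrong abstraction instead, say `Rc<RefCell<Node>>` everywhere, is exactly the trap where an LLM burns tokens spinning in circles against the borrow checker.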
The slides: https://go.dev/talks/2015/gogo.slide#3
An interesting similarity:
>We had our own C compiler just to compile the runtime.
The Bun team maintain their own fork of Zig too
If they did, I guess they would rewrite Deno in C++.
[0]: https://github.com/oven-sh/bun/compare/claude/phase-a-port
Whether or not they can clean it up is an interesting question.
https://github.com/oven-sh/bun/compare/claude/phase-a-port#d...
That isn't particularly surprising, but the point is I would expect getting things more stable than the Zig version to take a while.
What is most interesting here for me is:
- a big, clear outcome and acceptance criteria, vibe coding project on
- a public, working, high performance, full featured, production codebase by
- the leading LLM model maker known for the strongest coding ability
A good example whether it succeeds or not.
Rust, on the other hand, is pretty established by now and has fewer breaking changes. It also has more compile-time safety guarantees, which lets you vibe-code with a bit more confidence.
On top of that, Zig has rejected their upstream contributions, so they'd have to maintain their own compiler in the long run, which is probably just technical debt.
It seems there was an issue where the image API ignored the ICC profile (now fixed). Any developer with experience implementing image formats would almost certainly avoid this mistake. This is a problem that cannot be solved with vibe coding. In this situation, the user is merely a guinea pig for bug fixes.
Sounds like responsible open source software development to me. That's what pre-releases are for.
I've had more success vibe coding Rust than I have in more dynamic languages. I suspect the strictness of the Rust compiler forces the AI agent to produce better code. Not sure. It could be just that I am less familiar with Rust so it feels like it's doing a better job.
> Not sure. It could be just that I am less familiar with Rust so it feels like it's doing a better job.
Ya think?
As a fan of the language, I hope it leads to some reflection on things that might need to change moving forward.
[1] https://ziggit.dev/t/bun-s-zig-fork-got-4x-faster-compilatio...
Haha, is it really okay not to retract the caricature criticizing Rust that the official account previously posted?
I would guess dealing with breaking changes is a big motivation for this.
As an aside, I've been bitten by Zig's breaking changes on my own projects as well. It's taken the shine off of Zig and I'm looking at alternatives.
I'm not a Rust dev, but even I have noticed that tokio is kind of shunned in most projects. Why is that? Is it just bad, or what?
Source: I worked on Deno, competed directly with Bun on HTTP performance (and won on some metrics).
Edit: and of course I typed future instead of task (aka "spawned future"). Thanks, child commenters below. Much of Deno was built on spawning futures that mapped to promises and doing it as fast as possible. I spent ages writing a future arena to optimize this stuff..
Edit: and tasks.
You much rather have this runtime you're building manage task scheduling and allocation and all that. It's the most natural design choice to make.
I think avoiding async entirely might be a mistake, and I'm not entirely convinced anything better than a general-purpose async runtime might exist for a JS runtime (it itself is general purpose after all).
Avoiding std::fs is fucking bizarre to me: it's completely sync and is a really lightweight abstraction over syscalls.
However, there are reasons why you might not want to use it:
- You don't need async at all
- You want to own the async execution polling completely
- You want some alternative futures executor like io uring (even though tokio-uring is a thing)
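On the std::fs point: the sync path really is minimal. A sketch (hypothetical `read_config` wrapper) of what the comment means by "a lightweight abstraction over syscalls":

```rust
use std::fs;
use std::io;

// std::fs::read_to_string is a thin synchronous wrapper: roughly
// open(2), a size hint from the file metadata, read(2) in a loop,
// then close(2) on drop. No executor, no waker, no task scheduling.
fn read_config(path: &str) -> io::Result<String> {
    fs::read_to_string(path)
}
```

Compare tokio::fs, which (since most OS file I/O isn't truly async) dispatches the same blocking call onto a worker thread pool, adding scheduling overhead for no concurrency benefit in a runtime that wants to own its own I/O.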
Problem is fanboys like YOU.
April 27th - Zig contributor mlugg clarifies why the specific optimizations Bun did were ill advised and wouldn't have been accepted in Zig, regardless of AI use [1]
May 4 - Bun is looking into Rust as an alternative.
This, to me, seems like total whiplash. Has anyone at Bun made a statement on why they're making such dramatic changes? It seems like the lesson to internalize from mlugg is not "switch to Rust"
[1] https://lobste.rs/s/ifcyr1/contributor_poker_zig_s_ai_ban#c_...
It's not memory safe. Rust is. I'm curious why anyone would use it for a systems program, especially when AI can write it in rust instead with as much effort. Arguably it would do better with rust since LLMs are trained on more rust.
I've really enjoyed Bun the past year or so, but the acquisition by Anthropic, Bun's codebase and documentation increasingly becoming AI slop, and this impulsive complete rewrite - all of it has ruined it for me and I'm actively moving off of Bun. I don't feel comfortable relying on it any longer.
This makes me respect Zig team's stance more, that it's a technical decision more than an ideological one.