> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]
> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.
I recall similar arguments being made against search engines: People who had built up a library of internal knowledge about where and how to find things didn't like that it had become so easy to search for resources.
The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.
It may well be. Books have tons of useful expository material that you may not find in docs. A library has related books sitting in close proximity to one another. I don't know how many times I've gone to a library looking for one thing but ended up finding something much more interesting. Or to just go to the library with no end goal in mind...
Speaking as a junior, I’m happy to do this on my own (and do!).
Conversations like this are always well intentioned, and friction truly is super useful to learning. But the ‘…’ in these conversations always seems to imply that we should inject friction.
There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.
If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.
Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.
I’ll also make the observation that the extra time spent is very valuable if learning is your sole objective, but often the Business™ needs something working ASAP.
When I first opened QBasic, <N> years ago, when I was a wee lad, the online QBasic help didn't replace my trusty qbasic book (it supplemented it, maybe), nor did it write the programs for me. It was just there, doing nothing, waiting for me to press F1.
The naturally curious will remain naturally curious and be rewarded for it, everyone else will always take the shortest path offered to complete the task.
Disagree. While documentation is often out of date, the threshold for maintaining it properly has been lowered, so your team should be doing everything it can to surface effective docs to devs and AIs looking for them. This, in turn, also lowers the barrier to writing good docs since your team's exposure to good docs increases.
If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.
1965: learning how to punch your own punch cards is part of the learning process
1995: struggling with docs and learning how and where to find the answers is part of the learning process
2005: struggling with stackoverflow and learning how to find answers to questions that others have asked before quickly is part of the learning process
2015: using search to find answers is part of the learning process
2025: using AI to get answers is part of the learning process
Not really. There’s a pattern to reading docs, just like there’s a pattern to reading code. Once you’ve grasped it, your speed increases a lot. The slowness juniors have is a lack of understanding.
Complaining about docs is like complaining that research articles aren’t written like elementary school textbooks.
It really depends on what's being learned. For example, take writing scripts based on the AWS SDK. The API documentation is gigantic (and poorly designed, as it takes ages to load the documentation of each entry), and one uses only a tiny fraction of the APIs. I don't find "learning to find the right APIs" valuable knowledge; rather, I find "learning to design a (small) program/script starting from a basic example" valuable, since I waste less time on menial tasks (i.e., textual search).
Struggling with poorly organized docs seems entirely like incidental complexity to me. Good learning resources can be both faster and better pedagogically. (How good today's LLM-based chat tools are is a totally separate question.)
For an experienced engineer, working out the syntax, APIs, type issues, understanding errors, etc is the easy part of the job. Larger picture issues are the real task.
But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.
What is a larger issue? Lacking domain knowledge? Or lacking the deeper understanding of years of accumulated shit in the codebase that seniors may have? Where I work, there is no issue that is "too large" for a junior to take on; it is the only way that a "junior" becomes a "non-junior": by doing, not by delegating to so-called seniors (I am one of them)
"Larger issue" is overall technical direction and architecture, making decisions that don't paint you into a corner, establishing maintainability as a practice, designing work around an organization's structure and habit and so on.
But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.
You can't give a junior tasks that require experience and nuance acquired over years of development. If you babysit them, then perhaps, but then what is the point? By its nature, "nuance" is something hard to describe concretely, but as someone who has mentored a fair few juniors, most of them don't have it. AI generally doesn't have it either. Juniors need tasks at the boundary of their capability, but not far beyond it, to be able to progress. Simply allowing them to make a mess of a difficult project is not a good way to get there.
There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.
Unnecessary complexity, completely arbitrary one-off designs, over-emphasis on one part of the behavior while ignoring others, using design patterns where they shouldn't be used, writing code once and forgetting that operations exist, using languages and frameworks that are familiar but unfit for the purpose. The list goes on, and I see it happen all the time. AI only makes it worse, because it tends to validate all of these with "You're absolutely correct!"
We had 3 interns this past summer - with AI I would say they were VERY capable of generating results quickly. Some of the code and assumptions were not great, but it did help us push out some releases quickly to alleviate customer issues. So there is a tradeoff with juniors. May help quickly get features out, may also need some refactoring later.
I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space because they still can't narrow down the last step effectively (or possibly independently).
There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.
The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.
I've learned a lot of shit while getting AI to give me the answers, because I wanted to understand why it did what it did. It saves me a lot of time trying to fix things that would have never worked, so I can just spend time analyzing success.
There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.
I kind of agree here. The mental model that works for me is "search results passed through a rock tumbler". Search results without attribution and mixed-and-matched across reputable and non-reputable sources, with a bias toward whatever source type is more common.
Don’t confuse this with this person’s ability to hide their instincts. He is redefining “senior” roles as junior, but words are meaningless in a world of numbers. The $$$ translation is that something that was worth $2 should now be worth $1.
My first response is "let me mention how the real business world actually works" .. but let's add a more nuanced slice to that
Since desktop computers became popular, there have been thousands of small to mid-size companies that could benefit from software systems.. A thousand thousand "consultants" marched off to their nearest accountant, retailer, small manufacturer or attorney office, to show off the new desktop software and claim ability to make new, custom solutions.
We know now, this did not work out for a lot of small to mid-size business and/or consultants. Few could build a custom database application that is "good enough" .. not for lack of trying.. but pace of platforms, competitive features, stupid attention getting features.. all of that, outpaced small consultants .. the result is giant consolidation of basic Office software, not thousands of small systems custom built for small companies.
What now, in 2025? "junior" devs do what? design and build? no. Cookie-cutter procedures at AWS lock-in services far, far outpace small and interesting designs of software.. Automation of AWS actions is going to be very much in demand.. is that a "junior dev" ? or what?
This is a niche insight and not claiming to be the whole story.. but.. ps- insert your own story with "phones" instead of desktop software for another angle
One thing I'd point out is that there are only so many ways to write a document or build a spreadsheet. There are a ton of business processes that are custom enough to that org that they have to decide to go custom, change their process, or deal with the inefficiency of not having a technical solution that accomplishes the goal easily.
Lotus Notes is an example of that custom software niche that took off and spawned a successful consulting ecosystem around it too.
certainly no -- not "all software" of anything. Where is the word "enterprise" in the post you have replied to ? "enterprise" means the very largest companies and institutions..
I did not write "all software" or "enterprise software" but you are surprised I said that... hmmm
The dude literally invented Extreme Programming and was the first signer of the Agile Manifesto. He's forgotten more about software development than most people on this site ever knew.
To be fair, even if I appreciate Beck, some people do get too famous and start to inhabit a space that is far removed from the average company. Many of these guys tend to give out advice that is applicable to a handful of top earning companies but not the rest.
Interesting take. I'm seeing a pattern: people think AI can do it all, but I see that juniors are often the ones who actually understand AI tools better than seniors. That's what the AWS CEO points out. He said juniors are usually the most experienced with AI tools, so cutting them makes no sense. He also mentioned they are usually the least expensive, so there's little cost saving. And he warned that without a talent pipeline you break the future of your org. As someone who mentors juniors, I've seen them use AI to accelerate their learning. They ask the right questions, iterate quickly, and share what they find with the rest of us. Seniors rely on old workflows and sometimes struggle to adopt new tools. However, the AI isn't writing your culture or understanding your product context; you still need people who grow into that. So I'm not worried about AI replacing juniors. I'm more worried about companies killing their own future talent pipeline. Let the genies help, but don't throw away your apprentices.
Their implication is that junior devs are more likely to have built up their workflow around AI tooling, probably because, being younger, they had more plasticity in their process to adopt it.
Overall I don't quite agree. This applies to me personally: I've been using vim for the last decade, so any AI tooling that wants me to run some Electron app is a non-starter. But many of my senior peers coming from VS Code have no such barriers.
Speaking of vim - adding and configuring the Copilot plugin for vim is easy (it runs a Node.js app in the background, but if you have a spare 500 MB of RAM it's invisible).
ON TOP OF IT ALL, juniors are the ones who bring novel tools to the desk MOST times... e.g. I had no clue the Google IDE gave you free unlimited credits despite the terrible UI, but a young engineer told me about it!
I've seen seniors and juniors bring novel tools in. Seniors do it less often perhaps - but only because we have seen this before under a different name and realize it isn't novel. (Sometimes the time has finally come; sometimes it fails again for the same reason it failed last time.)
I'm just shocked people aren't clueing into the fact that tech companies are trying to build developer dependence on these things to secure a "rent" revenue stream. But hey, what do I know. It's just cloud hyper scaling all over again. Don't buy and drive your own hardware. Rent ours! Look, we built the metering and everything!
I'd hope people are. It's painfully obvious this entire AI push is rent-seeking half hidden by big tech money. At some point the free money is going to go away, but the rent for every service will remain.
Amazon has an internal platform for building software. The workflows are documented and have checks and balances. So the CEO wants to have more junior developers who are proficient with AI, and (in ratio) fewer senior developers. Also, product context comes from (product) managers and UX designers.
For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.
Nah, models can be fine-tuned and trained on anything. Common consumer products like ChatGPT and Gemini have particular styles, very polite and helpful, but there are models trained to be combative, models trained to write in the style of Shakespeare, all sorts of things. Someone could train a model to reply to posts in the style of HN comments and you’d probably never know.
you're right but my opinion about this has changed
I would have agreed with you 100% one year ago. Basically, senior engineers are too complacent to look at AI tools, as well as ego-driven about them, all while corporate policy disincentivizes them from using anything at all, with maybe a forced Copilot subscription. Junior engineers, meanwhile, will take the risk that the corporate monitoring of cloud AI tools isn't that robust.
But now, although many of those organizations are still the same - with more contrived Copilot subscriptions - I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.
I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.
Perhaps more complacent firms are the same as they were a year ago.
Maybe, but you make it sound like juniors are worth more to companies than seniors. Then fire most or all of your seniors and good luck with the resulting situation.
Coding in any sufficiently large organization is never the main part of a senior's time, unless it's some code sweatshop. Juniors can do little to none of all that remaining glue that takes projects from a quick brainstorming meeting to a live, well-functioning, supported product.
So as for worth: companies can, in a non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUD apps or eshops built from templates.
Also, there is this little topic that resonates across recent research: knowledge gained fast via LLMs is shallow; it doesn't last as long and doesn't go as deep. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into the specs, the implementation, etc., and a few times pushed it to the limits (or realized the task was beyond what regex can do), instead of just being given the result, copy-pasting it, and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.
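To make the "beyond what regex can do" point concrete, here is a minimal Python sketch (the pattern and the depth-2 limit are illustrative, not from the comment): a hand-built pattern for a fixed nesting depth matches up to that depth and silently fails one level deeper, which is exactly the kind of boundary you only discover by digging into the spec rather than copy-pasting.

```python
import re

# Regular expressions can't match arbitrarily nested structures.
# This pattern is hand-built for balanced parentheses up to depth 2:
# the outer pair, whose contents are non-paren characters or one
# non-nested inner pair.
DEPTH_2 = re.compile(r"^\((?:[^()]|\([^()]*\))*\)$")

assert DEPTH_2.match("(abc)")            # depth 1: matches
assert DEPTH_2.match("(a(b)c)")          # depth 2: matches
assert DEPTH_2.match("(a(b)(c))")        # still depth 2: matches
assert not DEPTH_2.match("(a(b(c))d)")   # depth 3: silently no match
assert not DEPTH_2.match("(unbalanced")  # unbalanced: no match
```

A basic test at depth 2 would pass and never reveal the depth-3 failure; recognizing that full nesting needs a real parser is the deeper knowledge the comment is talking about.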
I get what you say, and I agree partially, but it's a double-edged sword.
I think the temptation to use AI is so strong that those who keep learning will be the ones who are valuable in the future. Maybe by asking AI to explain/teach instead of asking for the solution directly. Or by not using AI at all.
I think seniors know enough to tell whether they need to learn or not. At least that's what I tell myself!
The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.
And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.
I can't help but feel this is backpedaling after the AI hype led to people entering university avoiding computer science or those already in changing their major. Ultimately we might end up with a shortage of developers again, which would be amusing.
I went to university 2005-2008 and I was advised by many people at the time to not go into computer science. The reasoning was that outsourced software developers in low-cost regions like India and SEA would destroy salaries, and software developers should not expect to make more than $50k/year due to the competition.
Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.
This might be the professional/career version of "buy when there's blood in the streets."
I went for CS in my late 20s, always tinkered with computers but didn't get into programming earlier. College advisor told me the same thing, and that he went for CS and it was worthless. This was 2012.
I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.
The one thing I learned in college is that the advisors are worthless. There's how many students? And you are supposed to expect they know the best thing for you? My advisor told me that all incoming freshmen must take a specific math class, a pre-calculus course, totally ignoring all of my AP exams that showed I was well beyond that. Wasted my time and money.
My take is that these are not binary issues. With outsourcing, it is true that you can hire someone cheaper in Asian countries, but that cannot kill all local jobs. So what happens is that the truly average or mediocre get replaced, by outsourcing then and by AI now, while the top talent can still command a good salary because they are worth it.
So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.
> Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.
At the end of the day, radiologists are still doctors.
Yup hearing big talk about competition and doom is a strong signal that there is plenty of demand.
You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.
When there is no competition that is a sign there is no demand.
There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.
Can anyone really blame the students? If I were in their shoes, I probably wouldn't bother studying CS right now. From their perspective, it doesn't really matter whether AI is bullshit in any capacity; it matters whether businesses who are buying the AI hype are going to hire you or not.
Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.
"after the AI hype led to people entering university avoiding computer science or those already in changing their major"
That's such a terrible trend.
Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!
Certainly, I even know of experienced devs switching out of tech entirely. I think the next couple of decades are going to be very good for software engineers. There will be an explosion of demand yet a contraction in supply. We are in 2010 again.
There will be programmers of the old ways, but AI is basically code 2.0, there are now a lot of things that are AI specific that those with traditional software development skills can’t do.
Or maybe they realize the AI needs humans in the loop for the foreseeable future for enterprise use cases and juniors (and people from LCL areas) are cheaper and make the economics make some sort of sense.
It's backpedaling but I don't think it's planning ahead to prevent a developer shortage - rather it's pandering to the market's increasing skepticism around AI and that ultimately the promised moonshot of AI obsoleting all knowledge work didn't actually arrive (at least not in the near future).
It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future, and now that that has all come to nothing, they've adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation for a cooldown of the AI hype, to appear rational and relevant to whatever comes next.
You are right, the perfect amount of false humility and balance. The wage suppression is an accidental byproduct and not the intent. Collateral damage, if you will.
So he's saying we should be replacing the seniors with fresh grads who are good at using AI tools? Not a surprising take, given Amazon's turnover rate.
My experience is that juniors have an easier time ramping up, but never get better at proper engineering (analysis) and development processes (debugging). They also struggle to read and review code.
I fear that unless you heavily invest in them and follow their progress, they might be condemned to decades of junior experience.
In my view there are two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things; developing taste is the process of pruning and refining pathways to do things better.
You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.
From interactions with our interns and new grads, they lack the taste, and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.
I gave Opus an "incorrect" research task (using this slash command[1]) in my REST server: research whether SQLite + the Litestream VFS can be used to create read replicas for the REST service itself. This is obviously a dangerous use of VFS[2], and of a system like SQLite in general (stale-reads- and isolation-wise). Of course it happily went ahead and used Django's DB router feature, implementing `allow_relation` to return true if `obj._state.db` was the `replica` or the `default` master DB.
Now, Claude had access to this[2] link and it got the data into the research prompt using web-searcher. But that's not the point. Any junior worth their salt would have caught the obvious problem (distributed systems 101); this was a failure to pay attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], how many tokens a model can burn thinking about these things and coming up with an optimal prompt, and corrections to it, is a very hard problem to solve.
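For readers unfamiliar with the feature being described, here is a minimal sketch of a Django-style database router of the kind the model generated. The alias names (`default` for the primary, `replica` for the Litestream copy) are assumptions for illustration; the actual generated code isn't shown in the comment.

```python
# Sketch of a Django database router (registered via settings.DATABASE_ROUTERS).
# Alias names "default" (primary) and "replica" (read replica) are assumed.
class PrimaryReplicaRouter:
    def db_for_read(self, model, **hints):
        return "replica"   # route all reads to the read replica

    def db_for_write(self, model, **hints):
        return "default"   # all writes go to the primary

    def allow_relation(self, obj1, obj2, **hints):
        # This is the permissive check being criticized: it treats the
        # replica as interchangeable with the primary, ignoring that the
        # replica may serve stale reads with weaker isolation guarantees.
        allowed = {"default", "replica"}
        return obj1._state.db in allowed and obj2._state.db in allowed
```

The sketch makes the criticism concrete: `allow_relation` happily relates objects loaded from the replica to objects on the primary, even though nothing guarantees the replica is caught up. That replication-lag blindness is the distributed-systems-101 failure in question.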
I believe the idea is not to stop hiring juniors. Instead it's to replace anybody who commands a high salary with a team of cheaper juniors armed with LLMs. The idea is more about dragging down average pay than never hiring anybody. At least for now.
But I think the actual reason was not addressed: the work of junior devs is exactly what can be replaced by AI, as opposed to the more complex abilities senior developers possess.
We frequently get juniors or interns who are perfectly capable of pumping out many LoC with the use of AI in various forms - the issue is that they _don't_ actually ever learn how to think for themselves, and can't fix problems when something goes wrong or the LLM paints itself into a corner. I have found myself doing a lot more shepherding and pairing with juniors when they can't figure something out recently, because they just have not had the space to build their own skills.
I've been managing and supporting teams for a long time and I'm sorry, but junior and mid-level devs do the majority of the heavy lifting when it comes to work output in big corps. I don't think AI will replace them. I don't think all these IC5 and IC6 engineers are going to be putting up 400-500 diffs a year anytime soon.
The level of cynicism here is astronomical. After discovering the strategy of "fire juniors and let a few seniors manage autonomous agents" was an abject failure, now the line is "actually juniors are great because we've brainwashed them into thinking AI is cool and we don't have to pay them so much". Which makes me want to vomit.
The only relevant point here is keeping a talent pipeline going, because well duh. That it even needs to be said like it's some sort of clever revelation is just another indication of the level of stupid our industry is grappling with.
I have heard this quite a few times over the last few months, each time from Amazon or AWS CEOs. Maybe this time he wants to replace senior engineers. That would be more useful for them, as with each passing year they have more and more of them, and in times like these seniors are not looking to leave Amazon on their own.
I recently pair-worked with two junior developers (on their first job, but still with like 2+ years with the company) in order to transfer the know-how of something.
I realized that they are shockingly bad at most basic things. Still, their PRs look really good (on the surface). I assume they use AI to write most of the code.
What they do excel in is a) cultural fit for the company and b) providing long-term context to the AIs for what needs to be done. They are essentially human filters between product/customers and the AI. They QA the AI models' output (to some extent).
These people are working on destroying the planet to make more money, they absolutely do not care. Our society isn't set up to punish them, but encourage such behavior to even more extremes (see datacenter build outs causing water shortages, electricity hikes, and cancer in poor communities; nearly every politician capitulating on such actions because they don't know better).
I wish people would get off the "AI is the worst thing for the environment" bandwagon. AI and data centers as a whole aren't even in the top 100 emitters of pollution and never will be.
If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.
Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.
AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of say, global shipping or air travel.
Why don't people get this upset at airport expansions? They're vastly worse.
Of course they aren't polluters in the sense of generating smoke themselves. But they do consume megawatts upon megawatts of power that has to be generated somewhere. It's not often you have the luxury of building near a nuclear power plant. And in the end you're still releasing those megawatts as heat into the atmosphere.
Obviously after 10-15 years of experience working as a developer AI will be a senior dev. Probably will get promoted to management with all that experience.
Promoting your best engineers to management sometimes gets you a great manager, but often gets you a mediocre or just-about-competent manager at the cost of a great engineer.
I'm a big fan of the "staff engineer" track as a way to avoid this problem. Your 10-15 year engineers who don't vibe with management should be able to continue earning managerial salaries and having the biggest impact possible.
I'm also a fan of leadership without management. Those experienced engineers should absolutely be taking on leadership responsibilities - helping guide the organization, helping coach others, helping build better processes. But they shouldn't be stuck in management tasks like running 1-1s and looking after direct reports and spending a month every year on the annual review process.
This is a general problem that corporations have trouble with: the struggle to separate leadership and people management. Why does the person who tells you what to do also need to be the same person who does your annual review, who also has to be the same person who leads the technical design of the project, approves your vacation, assists with your career development, and gives feedback or disciplinary correction when you mess up? Why do we always seem to bundle all these distinct roles together under "Manager"?
Absolutely agree. Regardless, my org keeps trying to get me to take a management role after 15 years dev experience. I love my job and don't like managing people. You couldn't pay me enough to become a manager.
This is exactly where I find myself. I've been asked several times to take on management, but I have no interest in it. I got to be a principal after 18 years of experience by being good at engineering, not management. Like you said, I can and do help with leadership through mentorship, offering guidance and advice, giving presentations on technical topics, and leading technical projects.
To me the more insidious problem is that we have juniors now that aren’t learning much because they lean on AI for everything. They are behind the curve.
Most of the apps that I use regularly fail at least once a day nowadays. I think this is a direct consequence of putting AI code in production without review/QA.
While I have no particular love for AI generated code, I think this has nothing to do with AI. Software has been unreliable for over a decade. Companies have been rushing out half baked products and performing continual patches for many years. And it's our fault because we have just come to accept it.
The "over" deserves a lot of emphasis. To this day, I save my code at least once per line that I type because of the daily (sometimes hourly) full machine crashes I experienced in the 80s and 90s.
The problem is human, not technical. Companies and managers need to start caring about the details instead of crossing items off a list. Until we see that culture shift in the industry, which might never happen, AI isn't going to help—if anything, it'll make the problem worse as devs rush to deliver on arbitrary deadlines.
Plus, if you are skipping tests, or telling yourself you wrote them when they don't actually verify anything in the first place, then buying into a hype cycle of “the AI writes perfect code” is unlikely to break the pattern.
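A quick sketch of what "tests that don't actually verify anything" looks like in practice (the `add` function and test names here are made up for illustration):

```python
# A hypothetical function under test.
def add(a, b):
    return a + b

# A "test" that runs the code but asserts nothing — it will pass
# no matter what add() returns, so it verifies nothing.
def test_add_useless():
    add(2, 2)

# A test that actually pins down behavior: any regression in add()
# makes one of these assertions fail.
def test_add_real():
    assert add(2, 2) == 4
    assert add(-1, 1) == 0

test_add_useless()
test_add_real()
```

A suite full of the first kind stays green forever, which is exactly the false confidence the comment above describes.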
Now with AI, I expect junior developers to learn much quicker and progress to senior very quickly. I'd now rather hire at least one of each to begin with, a "junior" and a "senior" developer, and then additionally hire more juniors to quickly turn them into seniors.
We do not need to hire any more outside senior developers who need to be trained on the codebase with AI, given that the junior developers catch up so quickly they already replace the need to hire a senior developer.
Therefore replacing them with AI agents was quite premature, if not completely silly. In fact it makes more sense to hire far fewer senior developers and instead turn juniors directly into senior developers, saving lots of money and onboarding time.
This is performative bullshit pandering to the increased skepticism around AI. He wouldn't be saying that if AI investment was still in full swing.
I do agree with him about AI being a boon to juniors and pragmatic usage of AI is an improvement in productivity, but that's not news, it's been obvious since the very beginnings of LLMs.
So it's performative when the head of AWS says it and not news. But it's not performative when you say it and people should have listened to you in the comments?
It's performative when you say whatever the market wants to hear rather than sticking to an opinion (no matter how flawed). This behavior reminds me of the cryptobros who hailed NFTs/web3 as the best thing since sliced bread, and when that didn't come to pass quietly moved on to the next grift (AI) with the same playbook.
(also I’m just talking out of my ass on a tech forum under a pseudonym instead of going to well-publicized interviews)
Am I missing some irony or sarcasm here? Aren't internships meant to spend some time teaching people the ropes in return for free hands? This sounds like a weird Jack Welch circlejerk.
This sounds like a comment from someone who doesn't have visibility into how good the models are getting and how close they are to fully autonomous, production-grade software development.
This is an easy theory to prove; if AI was anywhere close to a senior engineer, we'd see the costs of software development drop by a corresponding amount or quality would be going up. Not to mention delivery would become faster. With LLMs being accessible to the general public I'd also expect to see this in the open-source world.
I see none of that happening - software quality is actually in freefall (but AI is not to blame here, this began even before the LLM era), delivery doesn't seem to be any faster (not a surprise - writing code has basically never been the bottleneck and the push to shove AI everywhere probably slows down delivery across the board) nor cheaper (all the money spent on misguided AI initiatives actually costs more).
It is a super easy bet to take with money - software development is still a big industry and if you legitimately believe AI will do 90% of a senior engineer you can start a consultancy, undercut everyone else and pocket the difference. I haven’t heard of any long-term success stories with this approach so far.
This sounds like a comment from someone who has tested it in a limited capacity such as small blog sites or side projects that did not need to be maintained
> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]
> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.
I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...
The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.
I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...
Conversations like this are always well intentioned, and friction truly is super useful for learning. But the ‘…’ in these conversations always seems to imply that we should inject friction.
There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.
If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.
Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.
AI, on the other hand...
If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.
If you can just get to the answer immediately, what’s the value of the struggle?
Research isn’t time coding. So it’s not making the developer less familiar with the code base she’s responsible for. Which is the usual worry with AI.
1995: struggling with docs and learning how and where to find the answers part of the learning process
2005: struggling with stackoverflow and learning how to find answers to questions that others have asked before quickly is part of the learning process
2015: using search to find answers is part of the learning process
2025: using AI to get answers is part of the learning process
...
https://en.wikipedia.org/wiki/Mastery_learning
Any task has “core difficulty” and “incidental difficulty”. Struggling with docs is incidental difficulty, it’s a tax on energy and focus.
Your argument is an argument against the use of Google or StackOverflow.
Complaining about docs is like complaining that a research article isn't written like an elementary school textbook.
Also the difference between using it to find information versus delegating executive-function.
I'm afraid there will be a portion of workers who crutch heavily on "Now what do I do next, Robot Soulmate?"
But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.
But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.
There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.
Good luck maintaining that.
I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space, because they still can't narrow down the last step effectively (or possibly independently).
There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.
The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.
This is "the kids will use the AI to learn and understand" level of cope
no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser
There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.
Because that makes the most business sense.
Since desktop computers became popular, there have been thousands of small to mid-size companies that could benefit from software systems.. A thousand thousand "consultants" marched off to their nearest accountant, retailer, small manufacturer or attorney office, to show off the new desktop software and claim ability to make new, custom solutions.
We know now, this did not work out for a lot of small to mid-size business and/or consultants. Few could build a custom database application that is "good enough" .. not for lack of trying.. but pace of platforms, competitive features, stupid attention getting features.. all of that, outpaced small consultants .. the result is giant consolidation of basic Office software, not thousands of small systems custom built for small companies.
What now, in 2025? "junior" devs do what? design and build? no. Cookie-cutter procedures at AWS lock-in services far, far outpace small and interesting designs of software.. Automation of AWS actions is going to be very much in demand.. is that a "junior dev" ? or what?
This is a niche insight and not claiming to be the whole story.. but.. ps- insert your own story with "phones" instead of desktop software for another angle
Lotus Notes is an example of that custom software niche that took off and spawned a successful consulting ecosystem around it too.
TIL Notes is still a thing. I had thought it was dead and gone some time ago.
I did not write "all software" or "enterprise software" but you are surprised I said that... hmmm
https://substack.com/@kentbeck
What software projects is he actively working on?
In many cases he helped build the bandwagons you're implying he simply jumped onto.
The fact that I cannot tell if you mean this satirically or not (though I want to believe you do!) is alarming to me.
Sorry, what does that mean exactly ? Are you claiming that a junior dev knows how to ask the right prompts better than a Senior dev ?
Overall I don't quite agree. Personally this applies to me: I've been using vim for the last decade, so any AI tooling that wants me to run some Electron app is a non-starter. But many of my senior peers coming from VS Code have no such barriers.
Me too. Fire your senior devs. (Ha ha, not ha ha.)
Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.
For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.
"bespoke, hand generated content straight to your best readers"
I would have agreed with you 100% one year ago. Basically, senior engineers were too complacent to look at AI tools, as well as ego-driven about them, all while corporate policy disincentivized them from using anything at all, with maybe a forced Co-Pilot subscription. Junior engineers, meanwhile, would take the risk that the corporate monitoring of cloud AI tools wasn't that robust.
But now, although many of those organizations are still the same - with more contrived Co-Pilot subscriptions - I think senior engineers are skirting corporate policy too and become more familiar with tools.
I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.
Perhaps more complacent firms are the same as they were a year ago.
Coding in any sufficiently large organization is never the main part of a senior's time, unless it's some code sweatshop. Juniors can do little to none of all that remaining glue that takes projects from a quick brainstorming meeting to a live, well-functioning, supported product.
So as for worth: companies can, in non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUDs or e-shops from templates.
Also there is this little topic that resonates recently across various research: knowledge gained fast via LLMs is shallow, doesn't last that long, and doesn't go deeper. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into the specs, the implementation, etc., and a few times pushed it to the limits (or realized the task was beyond what regex can do), instead of just being given the result, copy-pasting it, and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.
I get what you say, and I partially agree, but it's a double-edged sword.
But I don't learn. That's not what I'm trying to do- I'm trying to fix the bug. Hmm.
I'm pretty sure AI is going to lead us to a deskilling crash.
Food for thought.
The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.
And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.
Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.
This might be the professional/career version of "buy when there's blood in the streets."
I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.
So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.
At the end of the day, radiologists are still doctors.
You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.
There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.
Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.
That's such a terrible trend.
Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!
It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future; now that none of that came to pass, they've adapted to the next grift (currently AI). He is now toning down his messaging in preparation for a cooldown of the AI hype, to appear rational and relevant to whatever comes next.
Pointing out that it wasn’t always that will make you seem “negative.”
Considering the talk around junior devs lately on HN, there's way too many of them, it would indeed be amusing.
To what?
I fear that unless you heavily invest in them and follow them, they might be condemned to have decades of junior experience.
You can describe pre-AI developers like that too. It's probably my biggest complaint about some of my co-workers.
In my view there's two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things, developing taste is the process of pruning and refining pathways to doing things better.
You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.
From interactions with our interns and new grads, they lack the taste and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.
Now Claude had access to this [2] link, and it got the data in the research prompt using web-searcher. But that's not the point. Any junior worth their salt — distributed systems 101 — would have known _what_ was obvious; the failure was in not paying attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], the issue is how many tokens it can burn thinking about these things, and coming up with an optimal prompt and corrections to it is a very hard problem to solve.
[1] https://github.com/humanlayer/humanlayer/blob/main/.claude/c... [2] https://litestream.io/guides/vfs/#when-to-use-the-vfs [3] https://docs.boundaryml.com/guide/baml-advanced/prompt-optim... [4] https://github.com/gepa-ai/gepa
"Amazon announces $35 billion investment in India by 2030 to advance AI innovation, create jobs" https://www.aboutamazon.com/news/company-news/amazon-35-bill... (Dec 9 2025)
The only relevant point here is keeping a talent pipeline going, because well duh. That it even needs to be said like it's some sort of clever revelation is just another indication of the level of stupid our industry is grappling with.
The. Bubble. Cannot. Burst. Soon. Enough!!
I realized that they are shockingly bad at most basic things. Still, their PRs look really good (on the surface). I assume they use AI to write most of the code.
What they do excel in is a) cultural fit for the company and b) providing long-term context to the AIs for what needs to be done. They are essentially human filters between product/customers and the AI. They QA the AI models' output (to some extent).
If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.
Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.
AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of say, global shipping or air travel.
Why don't people get this upset at airport expansions? They're vastly worse.
We do too, don't worry.
https://staffeng.com/about/
https://news.ycombinator.com/item?id=44972151
Does this story add anything new?
Problem solved.
Week 2: 32 interns
Week 3: 16 interns
Week 4: 8 interns
Week 5: 4 interns
Week 6: 2 interns
Week 7: 1 intern
Week 8: 0.5 interns
Is it possible to make it to the end of the summer without getting sliced in half?
What the hell.
Consider making them fight each other in an arena, you could monetize that.
I've yet to see that production-grade code written by these production-grade models.