Over here in Germany, a professor's job is "research and teaching". According to the internet, the author's university is publicly funded as well. I can see how AI can make you faster on the research side, but you give up 100% of the teaching/developing-people part.
As a taxpayer, I am very concerned if the people I fund with my taxes to do a job unilaterally declare they are no longer going to do half of it.
Teaching an undergraduate class or even a graduate class is still teaching. The author does not say he won't do that anymore.
The problem is about the fresh talent pipeline for researchers (i.e. PhDs). In many ways, elementary school and a Master's degree are more alike than a Master's and a PhD: in both, you are learning prior art with clearly defined exam/project assessments and no expectation of producing something truly novel, while a PhD is all about discovering something nobody has uncovered before. So calling this a problem of not wanting to teach isn't quite right.
IMO, the article is highlighting a different problem: until now, the problem in this area was that only a tiny sliver of the best engineering/CS undergrads wanted to go into research, given the far more lucrative industry careers; now the supply side of that market is about to vanish too due to agentic AI. This will basically kill the concept of an academic career as we know it, and the point of the article is that we need to find a different model of advancing and funding science.
Teaching and research should be decoupled. Professors are hired and granted tenure primarily based on their ability to produce original research. The skillsets are different; often good researchers are bad teachers, and good teachers are bad researchers.
There is a case to be made that teaching improves the understanding and insight of the teacher, which in turn can increase their research ability. For starters, it provides a less boring way of drilling fundamentals. More importantly, having to answer questions from students, which will very likely come from odd and unexpected directions, helps the teacher clarify their thinking. It could well be that one of these odd questions, whose answer the teacher takes for granted, actually holds some insight or raises questions about what they are working on outside of class.
In a similar vein, it is recommended that in a business meeting you hear what the junior people have to say first and then work your way up the chain of command, rather than the other way around: junior people are less familiar with internal processes and are thus more likely to flag or suggest something completely out of left field that the higher-ups might miss.
This used to be the case: research was conducted mostly at academic institutions that did not provide degrees [1]. The "research university" is a relatively new thing.
[1] https://asteriskmag.com/issues/10/the-origin-of-the-research...
That's not entirely the case in Germany. Applicants need to give a public lecture. Usually members of the student union will be present and will later have a say within the hiring committee about the quality of the teaching.
But I do agree that the ability to produce and procure research is not at all coupled with the ability to teach.
Absolutely not. You could argue this for entry level lectures, but not at the PhD level. PhD is learning how to do original research, how could you separate teaching that from doing that?
You probably need to step outside of your US-centric bubble if you are to comment on how university works outside of the US. There was a fairly large clue in the parent comment.
"Often good researchers are bad teachers, and good teachers are bad researchers" is a statement about humans, not a specific country, as far as I can tell. Sure, I happened to use the word "tenure" which is generally used in a U.S. context but you should be able to take a charitable reading of what I said and understand the broader point.
To my knowledge the view is correct for places outside the US.
UK universities do currently hire people to do research and teach. And tenure is based on research not teaching. Teaching is seen as something that funds the operation to an extent. Some are excellent teachers. Some merely provide the material.
It works as is because researchers are not meaningfully impacted by having to teach a few hours a week, and students get access to people who are in touch with the field. But it is not optimal to have people who often are not good at teaching, and/or don't particularly want to do it, giving lectures and tutorials.
Students are not only workers, they are also disciples of your work and, once forced to read it, will likely use it in the future even when they leave your lab.
Even completely egoistically replacing students with AI is shooting yourself in the foot in the long term.
It says a lot about US academic culture that they think in terms of hiring. There is an important educational commitment requirement in the role of professor, at least in Europe. Hiring is for the betterment of your own goals and almost orthogonal to the educational mission. A lot of unethical practices find their root in this schizophrenic mission statement of doing professional, competitive scientific research and at the same time educating graduates.
This sounds misguided. In the little experience I've had, I've seen that models get basic knowledge so absolutely wrong that giving them any sort of independence will not result in publications that positively impact a professor's reputation, or contribute to science. Or at least the reviews and papers I read that had AI content did not give me the impression that we should have more of this. And they require much more supervision, with the added issue that they cannot learn in the long term through your interactions, and without the enjoyment of teaching something to someone.
They're really good at finding papers though. Perhaps because navigating search engines has become a pain. Perhaps this will be the case in the future, but saying you're tempted right now is like saying you're being tempted to replace your HPC with quantum computers. It's a bit early.
Also 90% of citations generated by AI are wrong or straight up don’t even exist. It’s got such a long way to go to be able to reliably write credible papers.
[Source: https://www.reddit.com/r/AskReddit/comments/o6hlry/statistic... ]
Just like DEI and sustainability efforts, I predict we will see new initiatives for the forced hiring of juniors.
Implementation can differ (e.g. the ratio of interns to total headcount and so on), but it is time for governments to intervene and force corporations to train people. Humans are a resource for the government, and governments need to polish that resource to thrive.
A professor's job is to TEACH students.
Research grants are given by governments mainly to first TEACH students and secondly to get something useful.
If they are not doing their job, they should be fired.
That's not DEI or anything of the sort. That's common sense.
They can do their research at private companies if it's worth it.
> Research grants are given by governments mainly to first TEACH students
The government's goal is obvious and correct, but if you have done research and tried to get a grant, you should know that grants are very "political" as well. If you are researching something that is not trendy or will take another 10 years to yield results, while another lab is saying "we are researching LLMs", it will be very difficult for you to get a grant, even if you promise to TEACH/hire 20 students for that research.
Justifying long-term benefits is a difficult problem.
As someone who both recoils at DEI and is at least a decade too old to benefit from a policy like this personally, I have to say this honestly sounds like a great idea.
It both avoids the tragedy of the commons (why would a corporation pay to train a junior when they can just let their competition do it and then poach the experienced senior?) and gives more opportunity to a new generation that is frankly getting economically screwed over enough as-is.
Yeah, could be. Some problems I see with this implementation:
1. The wait time for the company to fill a position is too long; it is difficult to predict what will happen in the next 4 years.
2. It is difficult to match students with companies. For example, you are interested in CS, but the company specifically wants a React developer (assuming there were no AI and there were still demand): would the student change all their courses based on those requirements and live like a robot, forced to take courses they are not much interested in? Now imagine when the gap between topics is wider (CS vs React is closer compared to MBA vs procurement; both are somewhat subsets of the same topic).
The point is that we will still need senior level employees, but the way fresh grads get to that level is generally through entry level positions, experience and mentorship. I don't think we can expect the university system to start pumping out senior level graduates.
>In the process, they may bypass the valuable experience of struggling through early tasks and learning from their mistakes. Students, I worry, could simply become an intermediary between the raw idea and the AI’s output.
Even if all AI progress grinds to a permanent halt today, there's already enough utility in its current capability to force these questions. As a result, how we train and educate graduates and young people needs to change.
I have no doubt you need actual experience to be able to ensure AI output is at a production standard, but if we accept that reality, then a shift in how we educate and train young people could make an enormous difference in ensuring employers still see value in hiring people with no real commercial work experience.
It's interesting that this dilemma (of getting quick and easy wins) is occurring at multiple levels. Even as a junior researcher, it's often tempting to hand off the actual thinking and reasoning about one's research to AI (e.g. blindly accepting AI code) to quickly make 'progress'.
Apparently the same question is being asked at different levels and abstractions...
10 years from now, the people that stopped hiring novices and juniors are going to be deeply regretting their past decisions. The people that kept hiring are going to be working with their newly-promoted-to-senior colleagues and be making significantly more progress than those that didn’t keep hiring.
(IBM figured this out a couple of months ago, and explicitly announced tripling their hiring of juniors/grads in order to avoid ending up with a massive gap in the management/senior layers in future).
> “The companies three to five years from now that are going to be the most successful are those companies that doubled down on entry-level hiring in this environment,” Nickle LaMoreaux, IBM’s chief human resources officer, said this week.
Will they? Just because you developed them doesn't guarantee they will stay with you. It's always been the same issue tbh, but big companies could accept the risk because they pay the most competitive salaries anyway.
Except they won't. They will just hire those new people away from the firms that trained them. That's what happens now and there's no reason why it won't happen in the future.
This is why firms that do actual training have clauses written into the employment contract saying that if you receive x months of training from them, then you have to work for them for at least y years; otherwise, if you leave, you have to pay them back the cost of training you (which is written as a dollar amount in the contract).
Companies that don't have that kind of clause in the contract are going to get screwed over when their newly trained employees get poached by other firms.
I started my career with a graduate program at a larger company. I stuck around at that company for close to 5 years and would have liked to stay longer. My reason for leaving was the absence of career progression. For the first 3 years, the company had a great career progression path: clear outlines of what was needed for a promotion, fair and transparent pay, etc.
That changed, and despite hitting/exceeding my goals, I was denied a promotion twice with no good reason. My boss, who is fantastic, told me that he could not give me a good reason because he himself had not received one. So I left.
Generally speaking, my cohort of the program stayed with the company much longer than most employees. I don't think a single person left in the first 3 years. Attrition only started once there was a general shift in the company's culture and communication.
It honestly seems a little control freakish to think this way. People leave companies and that’s a good thing, they explore the industry and generally become more capable. If you leave on good terms there’s nothing holding back a renewed relationship, now with the added benefit of new perspectives; maybe meeting at conferences or working on a project. My gut is telling me these companies don’t part on good terms with their employees.
This is something that will have to be solved through the way research is funded.
At least for publicly funded work, it was always an assumption that you would need students to hit some goal, so by funding it you would get both the outcome and more people skilled in that field. If the scope of what one team/senior can handle has grown with AI, we will either need explicit staff numbers as a requirement, or a bigger scope, to the point where the AI can't handle it.
Or we find that AI can do so much the whole system implodes...
And here we see you’ve hit upon Jevons' paradox. The scope of work will grow to use more than it did before, now that human labour achieves more for the same money. Employment will ultimately go up, not down (over the long term; we are seeing a lot of short-term instability and noise, and much is said about AI without it yet showing up in the data, as per articles recently shared on HN about employment figures across the US and the world).
Why not hire a graduate and empower them to use AI? Much better interfacing with an actual human who will then go and do the work using all AI tools at their disposal.
This is morally wrong and it should be embarrassing to publish such an article.
It doesn’t surprise me to see such articles coming from academia, in which juniors are treated like dirt to such an extreme that is unimaginable in any other industry, save for maybe Michelin star cuisine.
The title is not relevant to the article, not even for a single line. The author straight-up assumes and never answers the 'why'; I came here to give the Lady Lovelace argument to Turing: you would NEVER hire an AI instead of a student unless you are making directionless slop. You can share goals, but not the vision, and the mission is different again. AI learns from experience, and humans are needed to build that experience, thanks to their extremely large 'context windows' that go as deep as the constant evolution of DNA (as long as it serves human-centric goals, which circles back to the mission part).
The article really is about "education seems directionless without economic goals", and again as comments have pointed out, it only seems so.
We measure scientific output as the number of publications, and that is the cause of BS like this.
These institutions have a duty to educate humanity. PhDs are also supposed to be able to help the public understand complicated science. To guide ethical decisions.
But no, we measure the number of papers, and not even their quality (very well).
It's all a matter of incentive alignment: what gets measured gets done. The state of academic science is sad in most places. This contemplation by the OP is a case in point.
It's one thing to be forced to use the damn things, but this guy gives it very serious thought, much more than others I've seen and known; he even writes a science.org article about it, and ultimately chose wrong.
> I’m not sure where that will leave students who start with no research experience.
What is wrong with this guy? Of course he knows where that will leave those students. Why did he even choose to be in the business of developing people? Nobody forced him. Anyway, the ladders were pulled up in 2020–2021.
Reading the piece, I _hope_ they are trying to make a point and do not really intend to stop helping novices become juniors. But who knows, nowadays...
You may be able to go fast with AI, but you can only go far with humans.
The shortage of senior engineers will be even worse than it is today.
Not sure your argument really holds any water over a 10+ year period as I originally described.
The motivation equation for taking on juniors to grow your long-term capabilities is shifting, to the point where it's harder to justify.