This is a great update! I hope the authors continue publishing new versions of their plots as the community builds up towards facility gain. It's hard to keep track of all the experiments going on around the world, and normalizing all the results into the same plot space (even with respect to just the triple product / Lawson criterion) is actually tricky for various reasons and takes dedicated time.
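For readers who haven't worked with the metric: a minimal sketch, with made-up plasma parameters, of how a single experiment gets reduced to a triple product and compared against the commonly quoted D-T ignition threshold of roughly 3e21 keV·s/m³ (none of these numbers come from the update or the paper):

    # Rough Lawson triple-product check for a hypothetical D-T plasma.
    # All plasma parameters are illustrative, not from any real experiment.
    n_e = 1.0e20    # electron density [m^-3]
    T_i = 10.0      # ion temperature [keV]
    tau_E = 1.0     # energy confinement time [s]

    triple_product = n_e * T_i * tau_E     # [keV * s / m^3]
    ignition_threshold = 3.0e21            # commonly quoted D-T figure

    print(f"n*T*tau = {triple_product:.2e} keV*s/m^3")
    print(f"fraction of D-T ignition threshold: {triple_product / ignition_threshold:.2f}")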
Somewhat relevant, folks here might also be interested in a whitepaper we recently put up on arXiv that describes what we are doing at Pacific Fusion: https://arxiv.org/abs/2504.10680
Section 1 in particular gives some extra high-level context that might be useful to have while reading Sam and Scott's update, and the rest of the paper should also be a good introduction to the various subsystems that make up a high-yield fusion demonstration system (albeit focused on pulser-driven inertial fusion).
Anyone have any idea where First Light Fusion's third machine fits into this?
The idea of using literal guns (gunpowder, then a light gas gun, then a coil gun) to impact projectiles against each other seemed ludicrous at first glance, but I haven't seen any critical coverage or hard numbers yet.

Any truth to that?
I heard that NIF was never intended to be a power plant, not even a prototype of one. It's primarily a nuclear weapon research program. For a power plant you would need much more efficient lasers, you would need a much larger gain in the capsules, you would need lasers that can do many shots per second, some automated reloading system for the capsules, and you would need a heat to electricity conversion system around the fusion spot (which will have an efficiency of ~1/3 or so).
It's an experimental facility. Yes, a power plant would need much more efficient lasers, but NIF's lasers date back to the 1990s; equivalent modern lasers are about 40x more efficient, and for an experiment it's easy enough to do a multiplication to see what the net result would have been with modern lasers.

Modern lasers can also repeat shots much more quickly. Power gain on the capsules appears to scale faster than linearly with the input power, so getting to practical gain might not be as far off as it appears at first glance.
These are some of the reasons that various fusion startups are pursuing laser fusion for power plants.
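To put rough numbers on why wall-plug efficiency matters so much, here is a back-of-envelope sketch of "engineering gain" (electricity out per unit of electricity into the driver). All efficiencies and gains below are assumptions for illustration, not figures from NIF or any startup:

    # Back-of-envelope engineering gain for laser inertial fusion.
    # Q_eng = wall-plug efficiency * target gain * thermal conversion.
    # Net electricity needs Q_eng > 1, plus margin for the rest of the plant.

    def engineering_gain(wall_plug_eff, target_gain, thermal_eff=0.4):
        """Electrical energy out per electrical energy into the laser."""
        return wall_plug_eff * target_gain * thermal_eff

    # Assumed NIF-like case: ~1% efficient 1990s-era laser, target gain ~2.
    print(engineering_gain(0.01, 2.0))   # ~0.008, far below breakeven

    # Assumed modern diode-pumped driver (~15% wall-plug) still needs
    # target gains of order 20-50 before the plant clears 1 with margin.
    for gain in (10, 30, 50, 100):
        print(gain, round(engineering_gain(0.15, gain), 2))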
From what I understood, laser fusion needs laser efficiencies not just 40x better than what NIF uses, but like 3 or 4 orders of magnitude more efficient than the state of the art. Seems like a non-starter.
I was trying to work out a joke about buying better lasers off of Alibaba, but it seems that despite being 30 years old, NIF's lasers are still orders of magnitude beyond off-the-shelf options.
Partially. The very efficient lasers from Alibaba don't offer short pulses at high peak power, so they could only serve as one part of the system: the pump lasers. The final nanosecond laser is still a one-off build, though that seems pretty doable even by a small company if they set their mind to it.
Btw, NIF achieved those recent results by adding a strong magnetic field around the target (penny-shrinkers have known that trick for 20+ years :). There are other things like this around that could be similarly useful, if only somebody had the money and interest...
It’s fascinating how NIF's legacy tech limits its relevance for actual energy generation, yet it still serves as a stepping stone. The fact that gain scales faster than linearly with input power is particularly encouraging — it suggests that advances in laser efficiency and repetition rate could unlock meaningful progress sooner than many assume. I can see why startups are jumping on this now. Curious to see how much of this can move from lab to grid in the next decade.
https://lasers.llnl.gov/about/what-is-nif

>NIF is a key element of the National Nuclear Security Administration’s science-based Stockpile Stewardship Program to maintain the reliability, security, and safety of the U.S. nuclear deterrent without full-scale testing.
Nothing about the NIF looks like a power plant to me. It's like the laser weapons guy and the nuclear weapons guy found a way to spend giant piles of money without having to acknowledge the weapons angle.
A lot of people think so, but the US government openly spends way more money on nuclear weapons than on fusion research. We'll spend almost a trillion dollars on nuclear weapons over the next decade.[1] The government's fusion funding was only $1.4 billion for 2023.[2]
So it seems more likely to me that some physicists figured out how to get their fusion power research funded under the guise of weapons research, since that's where the money is. NIF's original intent was mostly weapons research but it's turned out to be really useful for both, and these days, various companies are attempting to commercialize the technology for power plants.[3]

[1] https://theaviationist.com/2025/04/26/us-nuclear-weapons-wil...

[2] https://www.fusionindustryassociation.org/congress-provides-...

[3] NYTimes: https://archive.is/BCsf5
Yes. The NIF is a weapons research lab, not a power research lab.
The purpose of it is to show that the USA is still capable of producing advanced hydrogen bombs, more advanced than anybody else's.
The '2.05 megajoules' is only an estimate of the laser energy actually used to trigger the reaction. It ignores how much power it took to actually run the lasers or the rest of the facility. Even if they updated the lasers with modern ones, there is zero chance of it ever actually breaking even. It is a technological dead end as far as power generation goes.

The point of the 'breakthrough' is really more about ensuring continued Congressional approval for funding than anything else. They are being paid to impress, and they certainly succeeded in that.
However I suspect this is true of almost all 'fusion breakthroughs'. They publish updates to ensure continued funding from their respective governments.
People will argue that this is a good thing since it helps ensure that scientists continue to be employed and publishing research papers. That sentiment is likely true in that it does help keep people employed, but if your goal is to have a working and economically viable fusion power plant within your lifetime it isn't a good way to go about things.
If the governments actually cared about CO2 and man-made global warming they would be investing in fusion technology and helping to develop ways to recycle nuclear waste usefully. Got to walk before you can run.
It's been over 20 years since I've dug into nuclear tech in any depth, but don't we already have breeder reactors and other tech that is low-waste and safer? Couldn't we build modern fission reactors (not based on nuclear submarine designs) and deliver cleaner power today? Yes, there is a lot of politics, especially around manufacturing, production, and storage of spent fuel, so all of those are probably showstoppers no matter how safe the plants are in reality, but we aren't invested in it.
The primary purpose of the NIF is to maintain the US nuclear stockpile without nuclear tests. The lasers are very inefficient (IIRC about 2%). The success they claimed is that the energy released by the burning plasma exceeds the laser energy put into the fuel capsule. Since NIF was never intended to be a power plant, they don't use the most efficient lasers.
It was never intended to be a power plant but it was hoped that it would achieve a net gain fusion reaction for the first time. This turned out to be a lot harder than expected.
Yes, after the test ban treaties there was a huge push into exploring mathematical emulations of all aspects of fusion, and of assorted bombs, as well as laser ignition of pellets with these large lasers, using inertial confinement of the pellet as the laser impacted it and analysing the fusion by observing the emitted neutrons, X-rays, etc. They issued (sanitised) reports from time to time, and probably used the secret data to fine-tune emulated weapons with real data points.

The pellets were composed of potential fuels, various hydrogen and lithium isotopes, varied in composition to explore the ignition space. A number of pellets performed well in terms of gain, but were far, far from usable fusion once the Lawrence Livermore labs' costs were factored in. I think they determined it could not ever work as a fusion energy source, but it provided data. They still mine data from it with various elemental mixes making up the pellets.
It should be noted that "breakeven" is often misleading.
There's "breakeven" as in "the reaction produces more energy than put into it", and there's breakeven as in "the entire reactor system produces more energy than put into it", which isn't quite the same thing.
It's always confused me a bit. It's not like if you put 10 kWh into the reactor, that 10 kWh goes away. You still lose a significant fraction of it to inefficiencies in the cycle, but it still ends up as heat, which can be used to raise steam and turn a turbine. IIRC, you can get about 4 kWh back.
On the other side of the coin, if you put 10 kWh in and get 10 kWh of fusion out, that's 20 kWh of heat to run a steam turbine, which nets you about 8 kWh. So really you need to be producing 15 kWh of heat from fusion for every 10 kWh you put in to break even.

Availability (reliability engineering): https://en.wikipedia.org/wiki/Availability

Terms from other types of work: kilowatt-hours (kWh), weight per rep, number of reps, total time under tension.
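A minimal sketch of that breakeven arithmetic, assuming the ~40% thermal-to-electric conversion used in the example above (the efficiency is an assumed round number, and recirculating power for pumps, cryogenics, etc. is ignored):

    # Breakeven arithmetic: how much fusion heat is needed per unit of
    # electricity spent driving the reaction, if the input energy also ends
    # up as recoverable heat. thermal_eff = 0.4 is an assumed round number.

    def net_electricity(input_kwh, fusion_heat_kwh, thermal_eff=0.4):
        total_heat = input_kwh + fusion_heat_kwh
        return thermal_eff * total_heat - input_kwh

    print(net_electricity(10, 10))   # -2.0 -> still short of breakeven
    print(net_electricity(10, 15))   #  0.0 -> system breakeven
    print(net_electricity(10, 20))   #  2.0 -> modest net output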
Why is the last plot basically empty between 2000 and 2020? I understand that NIF was probably being built during that time, but were there no significant tokamak experiments in that time?
Author here - some other posters have touched on the reasons. Much of the focus on high performing tokamaks shifted to ITER in recent decades, though this is now changing as fusion companies are utilizing new enabling technologies like high-temperature superconductors.
Additionally the final plot of scientific gain (Qsci) vs time effectively requires the use of deuterium-tritium fuel to generate the amounts of fusion energy needed for an appreciable level of Qsci. The number of tokamak experiments utilizing deuterium tritium is small.
Thanks a lot for this research. Seeing the comments here, I think it's really important to make breakthroughs and progress more visible to the public. Otherwise the impression that "we're always 50 years away" stays strong.
Here was my completely layman attempt to forecast fusion viability a few months ago: https://news.ycombinator.com/item?id=42791997 (in short: 2037)

Is there some semblance of realism there, do you think?

In the 2037 timeframe, modeling trends doesn't matter as much as looking at the actual players. I think the odds are good because you have at least four very well funded groups shooting to have something before 2035: commercial groups including CFS, Helion, and TAE, plus the ITER effort. Maybe more. Each with generally independent approaches. I think scientific viability will be proven by 2035, but getting to economic viability could take much longer.
Companies like Commonwealth Fusion Systems are an example of those utilizing high-temperature superconductors which did not exist commercially when ITER was being designed.
> The design operating current of the feeders is 68 kA. High temperature superconductor (HTS) current leads transmit the high-power currents from the room-temperature power supplies to the low-temperature superconducting coils at 4 K (-269°C) with minimum heat load.

Source: https://www.iter.org/machine/magnets
HTS current feeds are a good idea (we also use them at CFS, my employer: https://www.instagram.com/p/DJXInDUuDAK/). It's HTS in the coils (electromagnets) that enables higher magnetic fields and thus a more compact tokamak.
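For anyone wondering why stronger coils translate into a smaller machine, here is a rough sketch of the standard argument. The scaling is the textbook relation that fusion power density goes roughly as (beta * B^2)^2, i.e. ~B^4 at fixed normalized pressure; the field values are approximate public figures, and real designs face many constraints this ignores:

    # Why higher-field HTS magnets allow a more compact tokamak, to first
    # order: fusion power density ~ (beta * B^2)^2, so ~B^4 at fixed beta.
    # Field values below are approximate.

    B_ITER = 5.3     # tesla on axis (approximate)
    B_SPARC = 12.2   # tesla on axis (approximate, HTS magnets)

    density_ratio = (B_SPARC / B_ITER) ** 4
    print(f"~{density_ratio:.0f}x higher fusion power density at fixed beta")

    # Equal total fusion power then needs ~1/density_ratio the plasma volume,
    # i.e. roughly density_ratio**(1/3) smaller in linear dimensions.
    print(f"~{density_ratio ** (1/3):.1f}x smaller in linear size")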
Presumably because everyone in MCF has been waiting for ITER for decades, and JET is being decommissioned after a last gasp. Every other tokamak is considerably smaller (or similar size like DIII-D or JT-60SA).
Much of the interesting tokamak engineering work has been on small (so low-power) machines, or just on concepts using high-temperature superconducting magnets.

There's the common joke that fusion is always 30 years away, but now, with the help of ITER, it's always 10 years away instead.
The really depressing part is that if you plot the rate of new delays against real elapsed time, the projected finishing date keeps moving even further out.
This is why much of the fusion research community feel disillusioned with ITER, and so are more interested in these smaller (and supposedly more "agile") machines with high-temperature superconductors instead.
Mind you, it's not useless! It produced a TON of very useful fusion research: neutral beam injectors, divertors, construction techniques for complex vacuum chambers, etc. At this point, I don't think it's going to be complete by the time its competitors arrive.
One spinoff of this is high-temperature superconductor research that is now close to producing actually usable high-TC flexible tapes. This might make it possible to have cheaper MRI and NMR machines, and probably a lot of other innovations.
> actually usable high-TC flexible tapes. This might make it possible to have cheaper MRI and NMR machines, and probably a lot of other innovations.
I'm sure there'll be plenty of fascinating applications of high-Tc tape, however I'm not sure MRI/NMR machines will be one of those. There would still be a lot of thermal noise due to the high temperature. Which is why MRI/NMR machines tend to use liquid helium cooling, not because superconductors capable of operating at higher temperatures don't exist.
ITER doesn't use high temperature superconductors. It uses niobium-tin and niobium-titanium low temperature superconductors in its magnets.
ITER has been criticized since early days as a dead end, for example because of its enormous size relative to the power produced. A commercial follow-on would not be much better by that power density metric, certainly far worse than a fission reactor.
There is basically no chance that a fusion reactor operating in a regime similar to ITER's could ever become an economical energy source. And this has been known since the beginning.
I call things like ITER "Blazing Saddles" projects. "We have to protect our phony baloney jobs, gentlemen!"
I looked hopefully at the HR report https://www.iter.org/sites/default/files/media/2024-11/rh-20... to see if there was some sort of job categorisation - scientist, engineer, management. Disappointingly scant. PhD heavy. Perhaps the budget would be more insightful.
"Execution not ideas" is a common refrain for startups.
I wonder how much of the real engineering for ITER is occurring in subcontractors?
> ITER doesn't use high temperature superconductors.
It does, for high-current buses that interface with regular resistive power distribution. They are also planned for some auxiliary components (like the neutral beam injectors).
> ITER has been criticized since early days as a dead end, for example because of its enormous size relative to the power produced.
ITER is NOT designed for power generation. It's essentially a lab experiment to see how plasma behaves in magnetic confinement and test various technologies.
That's why ITER was designed with a very conservative approach to reduce the technical risk. We don't need it to be compact; that can come later. We just need it to work.

And yes, it is necessary. Plasma behavior can't be simulated numerically or analytically. It always provides surprises, sometimes even good ones: https://en.wikipedia.org/wiki/High-confinement_mode
> ITER is NOT designed for power generation. It's essentially a lab experiment to see how plasma behaves in magnetic confinement and test various technologies.
That's the go-to excuse. But if you look at DEMO, its power density is not enormously greater. ITER is so far out of the running that DEMO (or PROTO, etc.) will be too.
We're learning a great deal about something that's largely irrelevant.
DEMO concept sketches are completely obsolete at this point. It's not going to look anything like this.
They're based on the state-of-the art from about 2005. Since then, a lot of improvements happened. A more realistic power plant design is going to use a thinner center column (because of better superconducting magnets), resulting in a smaller cryostat volume. Possibly high-TC magnets.
It can also be made more compact if neutral beams can be used to suppress some plasma instabilities.

(it's been 30 years away for 50 years already, but as long as I'm not dead 30 years from now, it's still a good investment...)

https://www.metaculus.com/questions/9464/nuclear-fusion-powe...

I want to believe, but this does not make that easier.
This will probably need to be updated soon. There are rumors NIF recently achieved a gain of ~4.4 and ~10% fuel burn up. Being able to ignite more fuel is notable in and of itself.
Progress toward net fusion energy is critical for delivering fusion power on the grid. It's not the only progress required — the rest of the machine has to be economical to build and operate. Most of the fusion machines in this paper are scientific projects, but as commercialization progresses, fusion machines with power plant needs in mind should arrive.
(I work for one startup in the field, Commonwealth Fusion Systems. We're building our SPARC tokamak now to demonstrate net energy gain in a commercially relevant design.)
If you took all the money in the world being spent on fusion research right now you would struggle to build a single 1 GWe fission power plant. That doesn't sound like an improvement in resource allocation to me.
What's the ROI on that versus current and near-term expected pricing for solar+storage? Is fission getting safer/cheaper at the same rate that solar and batteries are?
Solar + days of storage is far more expensive than fission. Grid scale batteries like California has spent billions on only have 4 hour capacity. Fission can also supply heat that is needed for many industrial processes and chemical reactions.
It is not, in most US areas. The only problem is the area covered, NOT the price of the technology. Solar with 12 hours of storage was cheaper than fission before COVID hit. TCO, not one-time nonsense.
Fission provides relatively low-temperature heat, i.e. no metal reduction, no "concrete" production. You can cook hot dogs with it. Also, electrification of heat can provide lower losses stemming from regulation, or the lack of a need for it: with electricity you can say "I need 293.5 degrees C", type it somewhere, and you get that regulation almost for free.
I am no fan of fission (I strongly oppose new fission plants). But one problem with solar+storage is that the cost of the storage component increases roughly linearly with the desired storage duration. That's not true of a fueled power plant (fission or fossil).
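To make the linear-with-duration point concrete, a tiny sketch. The per-kWh battery price and the target capacity are assumptions for illustration only, not market quotes:

    # Illustration only: storage capital cost grows ~linearly with hours of
    # duration at fixed power, whereas a fueled plant mostly just burns more
    # fuel. The $/kWh figure is an assumed round number, not a market quote.

    power_mw = 1000          # firm capacity to back up
    battery_cost_kwh = 300   # assumed installed cost, $/kWh

    for hours in (4, 12, 100):
        capex = power_mw * 1000 * hours * battery_cost_kwh
        print(f"{hours:>3} h of storage: ~${capex / 1e9:.1f}B of batteries")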
There are many problems with fission, all related to the extraordinary danger of handling the fuel, the byproducts, and the sites themselves.

The cost of the plants is huge; some people are hoping that modularity will help with construction, but it is still astonishingly expensive.
The problem of handling the fuel has been solved, in theory and in practice. Except when commerce is involved. When the money people get involved, corners will get cut, and we are back to incredible danger. Technically solvable, but I would not go near it. I have known too many business people.

The problem of the long-term waste is entirely beyond us. There has been no practical progress on this front. Long-term waste (including some parts of the assemblies themselves) is very dangerous for hundreds of thousands of years.

This is, with current technology that can be brought to bear, unsolvable.

The only thing we can do is put it in a stable site, be ready to move it when the site becomes unstable (nowhere on Earth is known to be stable on such time scales), and find a way of communicating, across thousands of generations, just how poisonous this stuff is.

Maybe our descendants will get lucky and find a way to safely dispose of it....
So fission power is making future generations pay for today's consumption.
Fortunately for us it is moot. The cost of renewables has dropped to the point that the only reason for fission is to build the capacity for nuclear weapons.

Reprocess the dirty fuel and bury the actual waste deep underground, like Finland is doing at the Onkalo spent nuclear fuel repository.

https://en.wikipedia.org/wiki/Onkalo_spent_nuclear_fuel_repo...
And there is still very much a need for zero-carbon DISPATCHABLE electricity, of which nuclear is the ONLY choice. You simply cannot have 100% of your electricity from only solar and wind, because it is far too variable and we simply don't have the technology to store electricity cheaply enough.
Your attitude towards nuclear energy is as irrational as the average antivaxer's towards vaccines.

How deep, to stay put for thousands of generations?
Lithium ion batteries are light with a high energy density, so are great for cars.
Flow batteries have a low energy density, but increasing the duration just means a bigger tank, and tank cost grows more slowly than volume (roughly with surface area), so the cost per stored kWh falls as the tank gets bigger (roughly as one over the cube root of the volume).
Flow batteries are well over a century old, but I have been reading about improvements over the last two decades. Where are they?
They're having trouble staying ahead of the enormous monster that is the lithium battery industry, which through sheer scale keeps lowering costs, allowing it to break into one market after another.
It is the good old: Good enough beats theoretically perfect.
There are a number of flow battery designs. They have large vats where charge is stored in two fluids with discrete charge states via a redox reaction. They charge a vat by pumping it through a cell, and discharge it by running it back in the other direction. The limits are the solubility of the charge states in the transport fluid (= huge vats for total watt-hours) and the size of the redox cells (for rate of charge/discharge). They run well, and vats are cheap.
https://en.wikipedia.org/wiki/Flow_battery
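A small sketch of the energy/power decoupling described above, pure geometry only: energy scales with tank volume, while the tank wall (a rough proxy for tank cost) scales with surface area. The electrolyte energy density is an assumed round number, not a property of any particular chemistry:

    # Flow batteries decouple energy (tank volume) from power (stack size).
    # Tank wall area per stored kWh falls as tanks grow, which is why the
    # marginal cost of extra hours is low. kwh_per_m3 is an assumption.

    def tank_wall_per_kwh(energy_kwh, kwh_per_m3=25.0):
        """m^2 of cubic-tank wall per kWh stored, for an assumed electrolyte."""
        volume = energy_kwh / kwh_per_m3   # m^3
        side = volume ** (1 / 3)
        return 6 * side ** 2 / energy_kwh  # wall area per kWh

    for e in (100, 1_000, 10_000):
        print(f"{e:>6} kWh: {tank_wall_per_kwh(e):.3f} m^2 of tank wall per kWh")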
If you don’t need a mobile power plant why bother with fusion power instead of something like geothermal? At the end of the day we’re just turning water into steam.
Maybe someday we’ll finally achieve the ultimate dream: an extremely expensive nuclear power plant that needs vast amounts of coolant water and leaves radioactive waste behind.
Tossing out your opinions as fact doesn't do much to win hearts and minds, or educate us bystanders to the basis for your point of view.
Presumably your comment is either to persuade or to inform; it does neither. I'm very curious about this field and its future, do you care to try again?
ITER began building in 2013, first plasma is expected for 2034. DEMO is expected to start in 2040.
So, ITER is taking an estimated 20 years. It's being built for a reason, so I imagine follow-ups want to wait to see how that shakes out. So certainly, DEMO needs to start a few years after ITER is finally done.
Then DEMO isn't a production setup either, it's going to be the first attempt at a working reactor. So let's say optimistically 20 years is enough to build DEMO, run it for a few years, see how it shakes out, design the follow-ups with the lessons learned.
That means the first real, post-DEMO plant starts construction somewhere around 2060. Yeah, it's fair to say a lot of the people here will be dead by then, and that'll only be the slow start of grid fusion, if it sticks at all. Nobody is going to just go and build a hundred reactors at once. They'll be built slowly at first, unless we somehow manage to start making them amazingly quickly and cheaply.
So that's what, half a century? By the time fusion gets all the kinks worked out, chances are it'll never be commercially viable. Renewables are far faster to build, many problems are solvable by brute force, and half a century is a lot of time to invent something new in the area.
ITER/DEMO is an exceptionally slow fusion project and arguably obsolete, since it uses older superconductors. CFS uses the same basic design, with modern superconductors that can support much stronger magnetic fields. Tokamak output scales with the fourth power of magnetic field strength, so this should let them get results similar to ITER's in a reactor a tenth the size. They'll have it running long before ITER is ready.

ARC, which uses those high-temperature superconductors, still has something like 40x lower power density than a fission plant.

Neither promises to be competitive with fission, never mind the things beating fission.
If Jesus Christ himself came to earth and hand delivered a durable and workable reactor design WITH high uptime WITH a near-optimal confinement scheme WITH zero neutronicity AND he included a decade of free perfectly packaged and purified fuel, it would still not pencil out as anything other than water-hungry staff-intensive baseload requiring significant state support.
This is the reality. It’s not happening. It’s a welfare program for bullshit artists that depends on a credulous public.
I am in the business of baiting militantly uninformed enthusiasts who form the foundation of the multigenerational grift that is Commercial Fusion Power.
Real talk, the point is not that whatever system is first past the post for fusion becomes the gold standard and fills the planet.
The issue right now is cracking the code. Once that is done, performance gains and miniaturization can take place.
Fusion can work on lots of things. It's possible that a fusion system the size of a car could be made within 25 years of the code being cracked that would power a house, or the size of a small building that could power a city block.
The waste product of hydrogen fusion is helium, a valuable resource that will always be in high demand, and it will not be radioactive.
And yes, it will need coolant as with hot fusion the system uses the heat to turn a turbine, but that coolant isn't fancy, it's just water.
Fusion has the potential to solve more problems than it causes by every metric as long as it is doable without extremely limited source materials, and this is what these big expensive reactors are trying to solve.
You’ve disputed nothing I’ve said and unless a dramatically higher temperature fusion reaction that does not generate a neutron flux is achieved, it will generate radioactive waste as a matter of factual physics. Thank you though!
I mean, yes, you're right, but it's not a permanently radioactive waste.
Quote:
A fusion power plant produces radioactive waste because the high-energy neutrons produced by fusion activate the walls of the plasma vessel. The intensity and duration of this activation depend on the material impinged on by the neutrons.
The walls of the plasma vessel must be temporarily stored after the end of operation. This waste quantity is initially larger than that from nuclear fission plants. However, these are mainly low- and medium-level radioactive materials that pose a much lower risk to the environment and human health than high-level radioactive materials from fission power plants. The radiation from this fusion waste decreases significantly faster than that of high-level radioactive waste from fission power plants. Scientists are researching materials for wall components that allow for further reduction of activation. They are also developing recycling technologies through which all activated components of a fusion reactor can be released after some time or reused in new power plants. Currently, it can be assumed that recycling by remote handling could be started as early as one year after switching off a fusion power plant. Unlike nuclear fission reactors, the long term storage should not be required.

Source: https://www.ipp.mpg.de/2769068/faq9
Basically, whatever containment vessel becomes standard for the whole fusion industry would probably need an annual cycle of vessel replacements, which would be recycled indefinitely and possibly mined for other useful radioactive byproducts in the process.
The amount of radioactive scrap produced by hypothetical decommissioned radioactive fusion containment vessels is laughably trivial compared to fission waste streams. Even accounting for the most pessimistic irradiation models of first-wall materials, the total radioactive burden remains orders of magnitude below legacy technologies.
The half-lives of the activated components, which are predominantly steel alloys and ceramic composites, trend dramatically shorter than those of actinide-laden spent fuel, with activity levels plummeting toward background within mere decades rather than geological timescales. This makes waste management a single-generation engineering challenge rather than a multi-millennial obligation.
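For a sense of the decay timescales, a shape-only sketch using two activation products common in irradiated steel. The half-lives are standard values, but the starting activities are set equal, which is not realistic, and longer-lived nuclides can appear depending on alloy impurities and the neutron spectrum:

    # Shape-only illustration of activation decay for two activation
    # products common in irradiated steel. Equal starting activities are an
    # assumption; real inventories depend on the alloy and neutron spectrum.

    HALF_LIVES_YEARS = {
        "Fe-55": 2.74,
        "Co-60": 5.27,
    }

    def remaining_fraction(half_life_years, years):
        return 0.5 ** (years / half_life_years)

    for years in (1, 10, 30, 50):
        parts = ", ".join(
            f"{iso}: {remaining_fraction(hl, years):.3g}"
            for iso, hl in HALF_LIVES_YEARS.items()
        )
        print(f"after {years:>2} y: {parts}")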
The long term activity of the waste is certainly lower, but the volume of the waste is likely much higher. And much of the cost is driven by volume, not activity.
As a species, we're spectacularly bad at negative externalities.
We are also very bad at anything very long term. We've hardly pulled off any physical project to last more than one generation recently. We barely invest in any.
The winning energy tech of the future better have as little negative externalities as possible, especially long term ones.
Hey, there it is! Lots of radioactive waste being generated on a continuous basis, but maybe, baby, with dreams and creams we can decommission it with robots and recycle it all. Meanwhile the reactor is offline for refurbishment for days, weeks, months, blowing a hole in the economics of it all.
Unironically: you’re the first person I’ve come across to openly acknowledge this issue. Thank you.