I might not have bought NVDA or timed BTC correctly, but at least I have 512 GB of DDR5 in my server and 128 GB in my MacBook Pro haha. The reality is that these are insanely huge amounts of RAM. I'm glad to have them because I don't need the tab-suspender extensions a bunch of my friends use, but really I'd prefer if GPUs were a bit cheaper and server hardware were generally easier to get. An SXM5-based motherboard is really hard to get these days, despite the fact that you can get super-powered EPYC 9755s for comparatively nothing.
It reminds me of the heady days of the Thai floods, when hard drives were inaccessible.
It's just a cartel cycle: gaining profits now, then eliminating all investment into competitors when a flood of cheap RAM "suddenly" appears.
I had my formative years in programming when memory usage was something you still worried about as a programmer. And then memory expanded so much that all kinds of “optimal” patterns for programming just become nearly irrelevant. Will we start to actually consider this in software solutions again as a result?
> And then memory expanded so much that all kinds of “optimal” patterns for programming just become nearly irrelevant.
I don't think that ever happened. Using a relatively sparse amount of memory translates into better cache utilization, which in turn usually improves performance drastically.
And in embedded stuff being good with memory management can make the difference between 'works' and 'fail'.
You're right in terms of fitting your program to memory, so that it can run in the first place.
But in performance work, the speed of RAM relative to computation has dropped so much that it's common wisdom to treat today's cache as the RAM of old (and today's RAM as the disk of old, etc.).
In software performance work it's been all about hitting the cache for a long time. LLMs aren't too amenable to caching though.
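A minimal sketch of the cache point, in C: the same O(n^2) summation over a row-major array, traversed row-wise versus column-wise. The array size and the exact speed gap are illustrative and machine-dependent; the point is only that touching memory in cache-line order is dramatically cheaper than striding across it.

```c
/* Row-major vs column-major traversal of the same data: identical work,
 * very different cache behavior. Sizes and timings are illustrative. */
#include <stdio.h>
#include <time.h>

#define N 4096

static double grid[N][N]; /* row-major in C, zero-initialized in .bss */

static double sum_rows(void) { /* sequential: every cache line fully used */
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += grid[i][j];
    return s;
}

static double sum_cols(void) { /* strided: one double used per cache line */
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += grid[i][j];
    return s;
}

int main(void) {
    clock_t t0 = clock();
    double a = sum_rows();
    clock_t t1 = clock();
    double b = sum_cols();
    clock_t t2 = clock();
    printf("rows: %.2fs  cols: %.2fs  (sums %g %g)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, a, b);
    return 0;
}
```

On a typical desktop the strided version runs several times slower, which is the "treat cache as the RAM of old" effect in miniature.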
I never really bought in to the anti-Leetcode crowd's sentiment that it's irrelevant. It has always mattered as a competitive edge: against other job candidates if you're an employee, or against the competition if you're a company. It only looked irrelevant because opportunities were everywhere during ZIRP, but good times never last.
Yes, it's a nefarious plot of AI producers to attempt a monopoly with a product that no one seems capable of demonstrating has the exponential value they're betting on.
I am working on my side product [1], where I was exploring a Rockchip part that required external memory (just 1 GB), which went from $3 to $32 and completely destroyed the economics for me. I settled on one with embedded memory and optimized my code instead :)
[1] https://x.com/_asadmemon/status/1989417143398797424
I suspect game development will be similar - game companies will optimize their games, given that consumer cards are not going to be released for a while or will be too expensive.
Not really new: Nvidia's GTX 1070 launched in 2016 with 8 GB of VRAM, and they've been slow-walking VRAM increases for the last decade.
Today's RTX 5060 has 8 GB for basically the same price that the 1070 did.
For $650 you can go up to 12 GB in the 5070, if you want 16 GB it's $1000 for the 5070 Ti, or hundreds more than that for the 5080.
I know there's inflation and $380 in 2016 was more money than it is today, but if you'd asked me 10 years ago I would've bet on VRAM capacity doing better than "the same money is worth less but still gets you exactly the same amount of memory 10 years from now."
With prices going up, I half expect Nvidia to launch the RTX 6070 and tell everyone "It has 4 GB of memory and we think you're going to love it. $900." Or they'll just stop bothering with consumer GPUs entirely.
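To put rough numbers on the inflation point, here is a back-of-envelope sketch in C. The ~34% cumulative 2016-to-2025 CPI factor and the 5060's $299 price are assumptions for illustration, not official figures.

```c
/* Inflation-adjust the GTX 1070's 2016 launch price and compare $/GB of
 * VRAM against today's RTX 5060. Both the CPI factor and the 5060 price
 * are assumed round numbers; street prices vary. */
#include <stdio.h>

int main(void) {
    const double cpi_factor = 1.34;               /* assumed 2016 -> 2025 */
    const double gtx1070_2016 = 380.0, gb_1070 = 8.0;
    const double rtx5060 = 299.0, gb_5060 = 8.0;

    double p1070_adj = gtx1070_2016 * cpi_factor; /* ~ $509 in 2025 money */
    printf("GTX 1070 (2025 dollars): $%.0f, $%.0f/GB\n",
           p1070_adj, p1070_adj / gb_1070);
    printf("RTX 5060:                $%.0f, $%.0f/GB\n",
           rtx5060, rtx5060 / gb_5060);
    return 0;
}
```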
Resource usage has been on a hedonic treadmill at least since I came online in the 90s. Good things have come from that, of course, but there's also plenty of abstraction/waste that's permitted because "new computers can handle it."
With so many gaming devices based on the AMD Z1 Extreme platform (and its custom Valve counterparts) over the past few years, it'll be great to see that be the target/baseline for a while. Brings access to more players and staves off e-waste for longer.
I'm not sure how we got on to games as resource hogs when Teams uses 2GiB of RAM and Windows itself uses 4GiB of RAM.
I work in gamedev, so perhaps I'm a bit sensitive, and I understand that general-purpose engines aren't as light on resources as the handcrafted ones that nobody can afford to make anymore... but we're nowhere close to the layers of waste and abstraction that present themselves when using webtech for desktop apps by default.
Arguably the connotation has changed slightly, but "AI slop" caught on because it fit so well.
It's uncommon, and associated with old-timey prisons and orphanages.
The word itself has existed for hundreds of years.
I think Europe should invest in manufacturing RAM. RAM isn't going anywhere; all of modern compute uses it. This would be an opportunity to create a domestic supply of it.
The worry is that these high prices aren't going to last long, and by the time you've spent years building the capacity, prices plummet, making your facility uneconomical to run.
RAM will always be in some demand, but that doesn't mean it's viable for everyone to start building production.
1) Prices aren't returning to "normal".
The only way they will is if the hyperscalers and AI companies start to implode -- which would kill a huge portion of the US economy and lead to a global recession. So: cheap RAM, but nobody can afford it.
2) By building up capacity you influence the outcome.
If someone else enters the DRAM space, the duopoly has to actually start thinking about competing on price. Maybe they become price-competitive before the launch of your new fab in order to kill it, but it will have an effect, and probably before the fab even opens.
3) A western supply chain has benefits by itself.
There's a reason some industries are not allowed to die, most notably farming: security, and resilience against external pressure.
---
Realistically there's no reason not to do this. It will be long, painful and expensive. The best time was a decade ago. The next best time is now.
> The only way they will is if the hyperscalers and AI companies start to implode
You're missing the picture: it's not companIES. The crisis was primarily caused by a single company, OpenAI, buying out wafers.
But even more than that, the wafer buyout is *an excuse* used by the cartel. There are several mechanisms that could have eased most of the problem (e.g. Samsung selling old equipment) that were not used, so as to ride the money wave.
The winners will not be the ones who build new fabs, but the ones who'll have enough money and government subsidies/import tariffs to protect such investments after the cartel decides to oversupply again, flushing the price down.
Prices will return to normal, probably 2-3 years from now. SK Hynix is making absolutely monstrous investments in memory fabs, and CXMT will be entering the market in force more and more.
The biggest problem is that the industry wants HBM, whereas consumers want DRAM. Until the need for HBM has been sufficiently satisfied, fabs will prefer being tooled for HBM because businesses can be squeezed much harder than consumers.
Then again, as a consumer you don't really need DDR5 or even DDR4, so long as you aren't using an iGPU. It's all about being around CL15 timings.
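The CL remark is about absolute latency: CAS latency is counted in clock cycles, and the data rate is twice the clock, so first-word latency in nanoseconds is roughly 2000 * CL / (transfer rate in MT/s). A quick sketch in C, using typical retail-kit figures as assumptions rather than claims about specific products:

```c
/* First-word latency in ns = 2000 * CAS / MT/s. The kits below are
 * assumed typical examples; the point is that absolute latency has
 * hovered near 10 ns across DDR3/4/5 generations. */
#include <stdio.h>

static double cas_ns(int cas_cycles, int mt_per_s) {
    return 2000.0 * cas_cycles / mt_per_s;
}

int main(void) {
    printf("DDR3-1866 CL10: %.2f ns\n", cas_ns(10, 1866));
    printf("DDR4-3200 CL16: %.2f ns\n", cas_ns(16, 3200));
    printf("DDR5-6000 CL30: %.2f ns\n", cas_ns(30, 6000));
    return 0;
}
```

All three land within about 1 ns of each other, which is why low CL at a given clock matters more than the DDR generation on the box.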
Your first point highlights the huge unmitigated risk. There is no guarantee that this won't all implode, triggering a huge recession. And even if no one can afford the RAM after, they especially won't be able to afford the more expensive European RAM.
Really the only way it could work is if the government declares it a national security issue and promises to subsidize it. Because in a purely free market, it's most likely to flop.
> The only way they will is if the hyperscalers and AI companies start to implode -- which will kill a huge portion of the US economy and lead to global recession, so, cheap RAM but nobody can afford it
I disagree.
Modern RAM is made in fabs, which are ridiculously expensive to build. Modern EUV lithography machines cost around $500M each. They're assembled by hand. Only one company in the world knows how to manufacture them right now. So we can't exactly increase global manufacturing capacity overnight.
The way I see it, there are two ways this goes:
1. AI is a fad. RAM and storage demand falls. Prices drop back to normal.
2. AI is not a fad. Over time, more and more fabs come online to meet the supply needs of the AI industry. The price comes down as manufacturing supply increases.
Or some combination of the two.
The high prices right now are because there's a demand shock. There's way more demand for RAM than anyone expected, so the RAM that is produced sells at a premium. High prices aren't because RAM costs more to manufacture than it did a couple years ago. There's just not enough to go around. In 5-10 years, manufacturing capacity will match demand and prices will drop. Just give it time.
FWIW I agree with you. The US should provide stable, consistent policy & funding so companies understand the regulatory environment and do long-term planning.
Which is a good idea for when we don't have a dementia patient in charge of our country.
EU should get on that though.
People forget quickly why we only have a handful of DRAM manufacturers today.
AI demand isn't going away. It will just move from the data center to the local machine. On-device AI is much better for the customer than AI in the cloud. Expecting people to stick with a few dozen GB of HBM is going to be the 'no one needs more than 640 KB' of the 2030s.
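For a sense of why local AI pushes RAM demand: weight memory alone is roughly parameter count times bytes per weight. A back-of-envelope sketch; the model sizes and quantization levels are illustrative assumptions, and KV cache and runtime overhead are ignored.

```c
/* Weight memory for local LLM inference, roughly:
 *   params * bytes_per_weight, ignoring KV cache and runtime overhead.
 * Model sizes below are illustrative, not specific products. */
#include <stdio.h>

static double weight_gb(double params_billions, double bits_per_weight) {
    return params_billions * 1e9 * (bits_per_weight / 8.0)
           / (1024.0 * 1024.0 * 1024.0);
}

int main(void) {
    printf("8B  @  4-bit: %6.1f GB\n", weight_gb(8, 4));
    printf("70B @  4-bit: %6.1f GB\n", weight_gb(70, 4));
    printf("70B @ 16-bit: %6.1f GB\n", weight_gb(70, 16));
    return 0;
}
```

Even a quantized 70B model wants a few dozen GB for weights alone, which is exactly the "640 KB" dynamic the comment describes.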
Not everyone, but a supplier in Europe would be a massive benefit long after the AI-driven demand dies off. It'd free them from dependence on other countries for a critical resource, making chips more affordable and the supply more stable - which is good, because the stability of the rest of the world is already questionable and big shocks are expected in the near future.
Aren’t Chinese manufacturers already expanding their capacity? Given that Samsung and SK Hynix have left that market in the pursuit of HBM4 chips, China is going to rule this market. At least that’s what analysts are saying.
Chinese manufacturers like CXMT face the same kinds of issues that Huawei faced in entering the EU market - the EU is clamping down on Chinese suppliers across their supply chain [0].
Where can CXMT and other Chinese players export when Japan, South Korea, much of ASEAN, India, much of North America, the EU, the UK, Australia, NZ, and parts of the Gulf have enacted or begun enacting trade barriers against Chinese exports?
[0] - https://www.ft.com/content/eb677cb3-f86c-42de-b819-277bcb042...
RAM isn't a critical security category like 5G base stations.
Also, I don't think you've seen true consumer rage until the opposition in the EU starts pointing out that the current parties are making the smartphones, laptops, TVs, and whatnot consumers wanna buy much more expensive (or more crappy). Large parts of the EU are currently being crushed by one of the worst housing crises in the world, the economy seems to be wavering for young people especially, and cheap tech / gadgets were one of the sole rays of light left.
Or their consumers will enjoy cheap PC part prices, with a possible gray-zone re-export market.
Of course we could see a retreat from global markets to mercantilism, but that has yet to fully happen.
Or China could stop antagonizing blocs like the EU by solidifying ties with Russia [0][1][2], imposing rare earth export restrictions on the EU [3], and undermining EU institutions [4].
[0] - https://www.reuters.com/world/asia-pacific/xi-putin-hail-tie...
[1] - https://www.reuters.com/world/china/chinas-president-xi-meet...
[2] - https://www.reuters.com/world/china/china-calls-closer-defen...
[3] - https://www.reuters.com/world/china/eu-steps-up-efforts-cut-...
[4] - https://www.scmp.com/news/china/diplomacy/article/3316875/ch...
Ahh, it is always China antagonizing others, isn't it?
There's nothing wrong with the EU going to absurd lengths in all directions, leading to the destruction of its economy and society, only to please the narrative of a few degraded groups of individuals. Yet it is others who are the cause. Nice, easy storytelling.
All the governments in the world have turned to the dark side. But some are reaching new heights.
Idea: Take the money that Germany promised to Intel if they build a state of the art fab. Instead, ask SK Hynix, Samsung, or Micron to build a DRAM fab in Germany.
It may seem that these are very similar processes, but only if you don't take into account the bribes from Intel to the specific officials (and their relatives) who make the decisions about subsidizing Intel.
SK Hynix, Samsung, or Micron don't treat good people well enough to give them taxpayer money.
> I think Europe should invest into manufacturing RAM ... This would be an opportunity to create domestic supply of it
How?
Most foundries across Asia and the US are being given subsidies that outstrip those the EU is providing, and the only mega-foundry project in Europe was canceled by Intel last year [0].
Additionally, much of the backend work like OSAT and packaging is done in ASEAN (especially Malaysia), Taiwan, China, and India. As much of the work for memory chips is backend work (OSAT and packaging), this is a field the EU simply cannot compete in: it has FTAs with the US, Japan, South Korea, India, and Vietnam, so any EU attempt would be crushed well before it could replicate the process.
Furthermore, much of the IP in the memory space is owned by Korean, Japanese, Taiwanese, Chinese, and American champions who are largely investing either domestically or in Asia, as was seen with MUFG's announcement earlier today to create a dedicated end-to-end semiconductor fund specifically to unify Japan, Taiwan, and India into a single fab-to-fabless ecosystem [1]. SoftBank announced something similar to unify the US, Japan, Malaysia, and India into a similar end-to-end ecosystem as well a couple weeks ago [2]. Meanwhile, South Korea is trying to further shore up their domestic capacity [3] via subsidies and industrial policy.
When Japanese, Korean, and Taiwanese technology and capital partners are uninterested in investing in building European capacity, American technology and capital partners have pulled out of similar initiatives in Europe, and the EU is working to ban Chinese players [4], what can the EU even do?
----
Edit: can't reply
> Why are you overlooking European semiconductor champions
Because they don't have the IP for the flash memory supply chain. And whatever capacity and IP they have in chip design, front-end fab, or back-end fab is domiciled in the US, ASEAN, and India.
> STMicroelectronics
Power electronics and legacy nodes (28nm and above) for IoT and embedded applications.
> Infineon
Power electronics and legacy nodes (28nm and above) for automotive applications.
> NXP
Power electronics and legacy nodes (28nm and above) for embedded applications.
> All of them are skilled enough to build and operate a DRAM fab in Europe. A bunch of EU dev banks can lend the monies to get it built.
They don't have the IP. Much of the IP for the memory space is owned by Japanese, American, Korean, Taiwanese and Chinese companies.
Additionally, most Asian funds own both the IP and capital (often with government backing), making European attempts futile.
Essentially, the EU would have to start from scratch, decades behind countries with whom the EU already has FTAs, countries that expanded capacity well before the EU and would thus be able to crush any incipient European competitor.
[0] - https://www.it-daily.net/shortnews-en/intel-officially-cance...
[1] - https://www.digitimes.com/news/a20260224VL219/taiwan-talent-...
[2] - https://asia.nikkei.com/economy/trade-war/trump-tariffs/soft...
[3] - https://www.digitimes.com/news/a20251230PD220/semiconductor-...
[4] - https://www.ft.com/content/eb677cb3-f86c-42de-b819-277bcb042...
Why are you overlooking European semiconductor champions? STMicroelectronics, Infineon Technologies, and NXP Semiconductors. All of them are skilled enough to build and operate a DRAM fab in Europe. A bunch of EU dev banks can lend the monies to get it built.
> I think Europe should invest into manufacturing RAM. RAM isn't going anywhere, all of modern compute uses it. This would be an opportunity to create domestic supply of it.
It's easy to build factories, much more difficult to train the engineers required to run them... and let's not even talk about all the crazy regulations and environmental rules at the EU level that make the task even more difficult, because yes, chip factories do pollute... a lot.
Countries like South Korea and Taiwan have adapted their legislation and their tax and environmental regulations to allow such factories to operate easily. The EU and EU countries will never do that... better to outsource the pollution and claim they care about the planet...
I am a CAD engineer and software developer who has worked in manufacturing a lot in the UK, in various industries, on products as big as superyachts and as small as peristaltic pumps. I think if the UK and EU are to try and defend their weakening and shrinking manufacturing sectors (these industries have been disappearing for my entire adult life), then it is possible but difficult... In 10 to 20 years it will be impossible.
The reason is as you have described. We are getting close to the point where the people with practical experience working in, managing, and designing things like work processes and factory layouts in industries that build physical products are disappearing. We're losing a lot of capable, practical engineers with hands-on experience. We can keep the universities going, teaching the physical subjects, but those lecturers wouldn't know even where to begin on designing and building efficient factories, unfortunately.
We'd probably end up having to get Chinese and Taiwanese businesses to send their 'experts' back to us in order to actually do this, and pay them a fortune - basically the reverse of what was happening in the manufacturing sector in the 80s and 90s!
The same applies to your comment.
Even the most excellent education system takes several years to educate a high-schooler to the level of a junior engineer. Then several more years are needed for the best of them to become senior engineers, with the knowledge and experience that a university alone cannot provide.
So we're looking at a decade-long project at the very least, even if everything goes as planned, and crazy fast, in both the technical and administrative departments.
All the more reason to start now, I guess. Putting it off isn't going to get them that knowledge and experience any sooner. If something happens over the next 10 years that eliminates our need for memory chips, things will probably be either too messed up or too wonderful for anyone to cry over the years they needlessly spent trying to secure a domestic source of RAM.
> Doesn’t the EU have an excellent education system?
Excellent universities, overall. But results from primary and secondary schools are nosediving at a more-than-alarming rate in several EU countries. Literacy rates are falling, math grades are falling. There's IMO only so much time before universities begin to be affected as well.
> Doesn’t the EU have an excellent education system?
Well, the EU has not manufactured a whole lot of chips in the last 30 years, so where do you get the people with the professional experience to teach new engineers... Oh, you mean you have to import the teachers from South Asia too? /s And it takes what, 5 years at minimum to train an engineer? France and the UK used to produce entire home computers... in the '80s...
Come on: STM, Nordic, Infineon, and NXP are all European. There are a bunch of chip-making installations in Dresden, Germany (GlobalFoundries, Bosch, etc.), and there's Intel Fab 34 in Ireland. BTW, TSMC is planning to open a production facility in Europe in 2027.
This is not comparable to Taiwan or the Shenzhen area, but it's definitely not nothing. Some local expertise exists, even though it may not be the most cutting-edge.
ASML, which is based in the Netherlands, produces chip-making machines which TSMC and everyone else use to produce said chips. I think they got some expertise too :)
Only a matter of time before you hear about shipping trucks going missing or being stolen. China is opening up more production, but I don't see any relief coming soon.
The joke is that Apple RAM pricing is now close to market level, they still have margin in there even at market prices, and they are notorious for supply chain management and locking in contracts/prices ahead of time. So I doubt Apple will change anything here short term.
Apple could lead here. They sell feels, not specs, so they could bring down OS and browser RAM requirements and sell lower-RAM entry models.
On the flip side, if you're buying a new computer in 2026 it's going to be even harder to justify not getting a MacBook: the chips are already 2 years ahead of PC, the price of the base models was super competitive, and now that RAM is super expensive even the upgraded versions are competitive with the PC market. Oh, and Windows is turning into an even larger pile of shit on a daily basis.
I'd buy a mac in a sec otherwise.
If Apple fully supported the Asahi Linux project, I'll switch in a heartbeat.
Isn't there a full-wafer AI chip mainframe for data centers now that blows anything needing RAM out of the water?
I don't understand how the RAM shortage exists if companies have surpassed Nvidia.
This is a fairly odd statement, given that BOMs are managed in manufacturing systems, and for accounting and engineering purposes, in multiple different ways. This can be for anything to do with sales data for a client, or for the guys on the factory floor, or for the accountants. There are sales BOMs, manufacturing BOMs, procurement BOMs, nested BOMs, etc., all for different parts of the business process... you would have BOMs within the organisation that were probably nearly 70%, and others that were 0%!
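For readers unfamiliar with the nesting: a BOM line can itself be an assembly with its own BOM, and the rolled-up cost is computed recursively. A toy sketch in C, with invented part names, quantities, and costs:

```c
/* Toy nested BOM: each part has a unit cost plus optional sub-assembly
 * lines, and rollup() sums cost recursively. All names and figures are
 * invented for illustration. */
#include <stdio.h>

typedef struct Part {
    const char *name;
    double unit_cost;          /* cost of the bare part or assembly step */
    const struct Part *sub;    /* sub-assembly lines, or NULL for leaves */
    const int *qty;            /* quantity of each sub-line */
    int n_sub;
} Part;

static double rollup(const Part *p) {
    double total = p->unit_cost;
    for (int i = 0; i < p->n_sub; i++)
        total += p->qty[i] * rollup(&p->sub[i]);
    return total;
}

int main(void) {
    Part dimm_lines[] = { {"DRAM die", 4.50, NULL, NULL, 0},
                          {"PCB",      1.20, NULL, NULL, 0} };
    int dimm_qty[] = { 16, 1 };
    Part module = {"DIMM assembly", 0.80, dimm_lines, dimm_qty, 2};
    printf("%s rolled-up cost: $%.2f\n", module.name, rollup(&module));
    return 0;
}
```

Whether "RAM is X% of the BOM" then depends entirely on which BOM, and at which level of the tree, you are quoting.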
I read that Apple will start feeling the heat in the third quarter of this year, although nobody knows for sure. That will either shrink their margins a bit or iPhone prices will go up.
The Verge had some good coverage of this, but TLDR: probably. Flagship phones may not rise in price enough to fully reflect it. Manufacturers might cut costs elsewhere, like keeping the same camera, or eat some of the cost increase in their margin.
https://www.theverge.com/tech/880812/ramageddon-ram-shortage...
They discussed it on the Decoder podcast as well.
Most base level smartphones are loss leaders and wouldn't be severely impacted and upper-tier smartphones tend to be priced at their true value. It's the mid-tier SKUs that get impacted.
Additionally, depending on which country you live in, telecom vendors reduce the upfront cost of the phone purchase and make up the difference via contracts.
Recently ordered a number of machines with 32 GB of RAM. Wanted 64, but was told prices couldn't be guaranteed, nor could delivery dates. Under the pressure of urgency, settled for whatever was available that day.
I think China is about to step in and take every last bit of non-AI market share, and then when the bubble bursts, companies like Micron and Samsung are going to be begging governments for a bailout.
I think we're at the peak, or close to it, for these memory shenanigans. OpenAI, which is largely responsible for the shortage, just doesn't have the capital to pay for it. It's only a matter of time before the chickens come home to roost and the bill is due. OpenAI is promising hundreds of billions in capex but has nowhere near that cash on hand, and its cash flow is abysmal considering the spend.
Unless there is a true breakthrough, beyond AGI into super intelligence on existing, or near term, hardware— I just don’t see how “trust me bro,” can keep its spending party going. Competition is incredibly stiff, and it’s pretty likely we’re at the point of diminishing returns without an absolute breakthrough.
The end result is going to be RAM prices tanking in 18-24 months. The only upside will be for consumers who will likely gain the ability to run much larger open source models locally.
Connectix was a big deal in its day. RAM Doubler was considered essential software.
They also marketed the first webcam and made emulators mainstream. Their PlayStation emulator is the basis for the case law that says emulators are fair use, decided as a result of a suit from Sony.
So what you're saying is that it could be worse, but not by much?
https://downloadmoreram.com
Idk if the owner changed or what, but the website used to be more comical.
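For the curious, RAM Doubler's headline trick was reportedly compressing inactive memory rather than paging it out. A toy sketch of the space side of that trade using zlib (build with -lz); the page contents here are invented and deliberately low-entropy, and a real implementation would hook the VM system rather than a buffer:

```c
/* Compress one "cold" 4 KB page instead of evicting it, zlib edition.
 * Low-entropy pages (common in practice) shrink to a handful of bytes. */
#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void) {
    unsigned char page[4096];
    memset(page, 0, sizeof page);               /* mostly-empty page */
    strcpy((char *)page, "mostly-empty page contents");

    unsigned char packed[8192];                 /* >= compressBound(4096) */
    uLongf packed_len = sizeof packed;
    if (compress(packed, &packed_len, page, sizeof page) != Z_OK) {
        fprintf(stderr, "compress failed\n");
        return 1;
    }
    printf("4096-byte page -> %lu bytes compressed\n",
           (unsigned long)packed_len);
    return 0;
}
```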
I asked ChatGPT directly how it was fair that OpenAI bought 40% of the world’s RAM supply.
It denied this, saying that the figures quoted were estimates only, that such massive RAM contracts would be easily obtainable public knowledge, and that the recent price increases were mostly cyclical in nature.
Any truth to this?
Edit to add: I am actually curious; I was under the impression that this 40% story going around was true and confirmed, rather than just hyperbole or speculation.