I think it's a bit rich to describe this as the 'future of video game preservation'.
The MiSTer project https://github.com/MiSTer-devel/Wiki_MiSTer/wiki more rightfully deserves that title. It's got a huge range of systems (across consoles, arcade, and microcomputers) and it's all GPL-licensed. The base board is a Terasic DE10 Nano, which is proprietary, but all other hardware required is open source.
The MiSTeX project aims to make MiSTer portable across different FPGA platforms https://github.com/MiSTeX-devel so a DE10 Nano won't be mandatory, enabling a new ecosystem of open hardware and commercial for-profit solutions.
I take no issue with people wanting to make money in this space. I take great issue with trying to gatekeep system preservation behind a mostly closed system you stamp an 'open' moniker on.
Five years ago the MiSTer people were similarly unable to STFU about Raspberry Pis. MiSTer is still emulation, it still has tons of issues, and it’s still $700 invested before you’re actually playing games. Meanwhile we’re in a golden age of RGB mods, flash carts, and optical drive emulators for original hardware.
While I'm a strong supporter of your position that the MiSTer (and any FPGA console implementation) is still emulation, it's worthwhile to keep in mind that this will indeed be _a_ future for preservation (I'm unsure about it being the only one, and warmer to, but not sold on, it being the best way).
Original hardware is great, but it's getting older, and failing. The Super Nintendo / Super Famicom, for instance, uses a ceramic oscillator as a clock source for its sound processing unit, and as the console is designed, the sound clock is essential for keeping the entire system in sync. Members of the Tool Assisted Speedrunning community have been experimenting with parts replacements to resuscitate this clock source with only mixed success. This is one of many pieces of silicon on these old platforms that will continue to fail as time goes on.
We can't make brand new classic consoles because the parts are just unavailable, and the industry has a well trodden path of reimplementing discrete electronics in FPGA for decades now with great success. If your goal is to keep playing games for a while longer on your OG console, then heck yeah, go for it. It does indeed do the thing. But if we want to be able to preserve this history in as close to the original form factor and play environment as possible, then we have to explore other options, and FPGA system cloning is a grand way to do this.
I think the most worrying thing about emulating old consoles by reverse-engineering them is that you're running out of time to test it. Say for example that the original Luigi looks blue in the hardware it was made for, but in an emulator he looks green. If there are no original NES left, you will never be able to know the emulator's code doesn't match how the NES actually worked. From that point on, the emulator experience is as canonical as it gets.
The MSRP of the DE10 Nano is $225, but they are still commonly out of stock. The going price for a scalped DE10 Nano is over $300, so all in, you are likely looking at $400 to $500. I kick myself often for not getting multiple DE10 Nanos earlier (there are more projects that used them than just MiSTer).
There's no circuit-level reproduction happening anywhere yet; it's basically emulation with a hardware clock. Analogue and kevtris know that FPGA isn't magic. Rather than contribute to the knowledge pool, like the MiSTer project, they're being bad actors and trying to muddle the situation to make money off of it. Near was trying to clarify the situation, but now we see where good intentions get you.
Perhaps you are referring to this video by What's Ken Making, which goes into detail about the advantages/disadvantages of both software and hardware emulation?
At 13:43, he demonstrates a couple of snes games with effects that don't work well in software emulation (along with what the effects / game mechanics should be on original hardware). At 22:25 he shows these same games running on FPGAs (analogue pocket, mister) and behaving correctly, just like on original hardware.
As an aside, if this isn't the one you were referring to, it's still an awesome video by an amazing creator; I'm a big fan of this stuff after discovering it by accident. He gives a nice introduction to FPGAs in this same video.
The argument being made is that the distinction between an FPGA and CPU is irrelevant to determine whether something is emulation.
An emulator accelerated by an FPGA is still an emulator, and a CPU simulating a console at the circuit level is no different in accuracy from an FPGA doing the same thing.
The main difference is the experience, where the FPGA can offer an experience closer to the original than a CPU running an operating system can, especially for older consoles. Not because of a difference in emulation accuracy, but because the user experience is different.
That’s just building a unit to spec using an fpga instead of a chip fab. If it’s a 1-1 copy, emulation would not appear to be the correct term.
Say I have a black-box chip that takes in 3 bits and spits out 3 based on the input. If I make a bit-perfect implementation of it, but my chip has additional outputs that correspond to unused inputs, and hence are never used, is it still considered emulation? It differs minimally, in a negligible manner.
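To make the thought experiment concrete, here's a toy sketch (the chip's truth table and the extra pin are entirely invented for illustration): the reimplementation matches the original on all three observable output bits, while also driving a fourth output that nothing ever reads.

```python
# Hypothetical 3-bit black-box chip; its behavior (a rotate-left) is invented
# purely for this example.
def original_chip(x):
    return ((x << 1) | (x >> 2)) & 0b111

# Bit-perfect reimplementation, plus an extra output pin wired to logic the
# original never exposed. No real program can observe the difference.
def reimplementation(x):
    core = ((x << 1) | (x >> 2)) & 0b111  # identical 3-bit behavior
    extra = (x == 0b101) << 3             # 4th bit, never used in practice
    return extra | core

# Every possible input produces identical observable outputs.
for x in range(8):
    assert reimplementation(x) & 0b111 == original_chip(x)
```

Exhaustively checking all eight inputs shows the two are indistinguishable through the original's interface, which is the point of the question.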
2: "hardware or software that permits programs written for one computer to be run on another computer"
Yup, FPGA recreating other hardware definitely fits the bill of emulation. Granted, Analogue has played up a marketing spin that boxes in "emulation" to mean "software running on some generic CPU" in order to claim "no emulation", but it's just marketing doing the work of marketing.
It depends on your interpretation of the definition. If the two "computers" that the webster definition is referring to have a different architecture, then Wine does not fit that definition, because Wine relies on the fact that it's the same computer architecture, namely an x86 PC.
If you combine Wine with Rosetta to run Windows programs on an M1 Mac, then it is definitely emulation by any definition. But then it's not Wine that's doing the emulation, it's Rosetta.
It doesn't implement the original hardware. It's not a copy of the original chip. Reverse engineering is used to get the behavior of the hardware. They reproduce the inputs and outputs with Verilog and write it to an FPGA.
If it weren't emulation, they would decap the original chip and directly copy the chip, gate-by-gate. It could be done, but it hasn't been done that way anywhere to date.
At the end of the day, both FPGAs and software emulators are Turing machines that produce a set of outputs given a set of inputs: any logic an FPGA can implement can also be implemented in software; it's computer science 101.
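As an illustration of that point, the same gate-level combinational logic an FPGA would map onto LUTs can be evaluated directly in software. Here's a hypothetical 4-bit ripple-carry adder built from individual gates:

```python
# A 1-bit full adder expressed as individual logic gates.
def full_adder(a, b, cin):
    axb = a ^ b
    s = axb ^ cin
    cout = (a & b) | (axb & cin)
    return s, cout

# Chained into a 4-bit ripple-carry adder, exactly as the hardware would be.
def add4(x, y):
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # 4-bit result; the overflow carry is dropped

assert add4(6, 7) == 13
```

Software steps through the same netlist sequentially that the FPGA evaluates in parallel; the result is identical, only the throughput differs, which is the tradeoff the rest of the thread discusses.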
FPGAs aren't magically more accurate. That is only up to the programmer and what effort they put in.
The main difference is efficiency and parallelism: it's much easier to reliably produce cycle-accurate parallel outputs in real time with an FPGA, compared to software running on a multi-tasking OS with many layers of abstraction and no deterministic real-time guarantees.
But, just as a single-core processor can fake multitasking by slicing time between processes (preemptive multitasking), software emulation can mimic parallelism if the host is beefy enough compared to the old-school system it's emulating.
The larger the performance / clock speed gap between the host and target, the more indistinguishable from a truly parallel FPGA an emulator can be.
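As a toy illustration of that time-slicing (all chip names invented), a single thread can step several "chips" one cycle each per master-clock tick, mimicking the lockstep parallelism an FPGA provides natively:

```python
# Each "chip" is a generator that performs one cycle of work per next() call.
def make_chip(bus, name):
    def chip():
        while True:
            bus[name] = bus.get(name, 0) + 1  # one cycle of "work"
            yield
    return chip()

bus = {}
chips = [make_chip(bus, "cpu"), make_chip(bus, "ppu"), make_chip(bus, "apu")]

for _ in range(100):      # 100 master-clock ticks
    for chip in chips:
        next(chip)        # every chip advances exactly one cycle per tick

# All chips stay perfectly in step, at the cost of host CPU time.
assert bus["cpu"] == bus["ppu"] == bus["apu"] == 100
```

The finer the slicing, the closer the software gets to true parallel behavior, and the more host cycles it burns per emulated cycle.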
Software emulation also has practical advantages for developers: while FPGAs force you to painstakingly implement every bit of functionality at the logic gate level, with software you can start off with a much higher-level model of the target system that's much easier to implement, and mix & match that with more precise low-level simulation where it matters.
The time this frees up (+ the availability of various libraries) allows the developer to spend more time researching the original system and adding modern quality of life features that just wouldn't be possible otherwise.
Nobody has put these chips under a scanning electron microscope, catalogued each gate, and then recreated those in extremely verbose Verilog.
That's what "implementing the target hardware" in an FPGA would consist of.
Even if this extremely-arduous process were to take place, you would likely still be unable to reproduce the analog chips used for audio and video in most FPGAs - certainly not the cheap FPGAs MiSTer and friends use.
> Nobody has put these chips under a scanning electron microscope, catalogued each gate, and then recreated those in extremely verbose Verilog.
I don't think anyone was doing that when iterating on the hardware back in the day, either. Dollars to donuts, the 65C02 was not based on a deep empirical analysis of the behavior of as-implemented 6502s, but was rather produced from modifications to the design spec that the original 6502 was itself implemented from, with the intent of maintaining compatibility with the original design.
Same thing here. An FPGA implementation of the original schematics of the hardware can be viewed as another instance of variant hardware built against the original design, whereas software emulation is simulating the outward behavior of the original hardware without any ability to be implemented directly from the original design at all.
> Even if this extremely-arduous process were to take place, you would likely still be unable to reproduce the analog chips used for audio and video in most FPGAs
Which returns us to the original description of these solutions being mostly hardware implementation via FPGA, but not entirely, as some emulation is still needed.
That’s how I see it too. The FPGA versions are knock-offs, like what would have been done back in the day, but with fancy future hardware. A gate-level reproduction would be kinda strange aside from a pure desire for preservation. (Which is still a decent goal, just not of these projects.)
The FPGA is used to implement an approximation of the original hardware’s functionality - not a perfect clone of the original hardware itself, which means they have their own unique bugs and inaccuracies compared to the original designs.
I think “emulation” fits as a reasonable description of this, personally - at least as something an average person will understand.
Wow, that looks useful! Out of curiosity, I scanned the latest Windows release (4.5.0) with VirusTotal and it reported several malware hits. I realize I could manually audit the source code and build an .exe from scratch but do you think the release hosted on GitHub is malicious?
I wish people would stop treating FPGAs as the Second Coming of the Lord or whatever. They really are not.
It's emulation, plain and simple. Not better or worse than software emulators. It usually lags behind the pure software emulators because there are fewer devs and because emulating stuff in hardware is harder than emulating stuff in software.
Just because it's in hardware doesn't mean that it is "better" or "more accurate".
The main advantage of FPGA emulation is concurrency. When you’re emulating a piece of hardware with multiple chips in software, you’re often forced to run the emulation in batches (i.e. run the CPU for 10 cycles, then run the video chip for 10 cycles, then run the audio chip for 10 cycles) for performance reasons. This matters because some games require a higher level of timing accuracy to (for example) paper over bugs in the game code, or perform fancy graphical tricks. There are cycle-accurate software emulators, but they aren’t really playable for consoles after the 16-bit era and require relatively powerful CPUs. FPGAs allow you to run multiple chips in parallel, which eliminates this issue, allowing for accurate emulation in a small battery-powered handheld device.
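A toy model of why batching matters (the cycle numbers and the register are invented for illustration): the CPU writes a video register at cycle 5, while the video chip samples it at cycle 3. Batched execution observes the write in the wrong order; one-cycle lockstep gets it right.

```python
class Cpu:
    def __init__(self, bus):
        self.bus, self.t = bus, 0

    def run(self, n):
        for _ in range(n):
            if self.t == 5:
                self.bus["scroll"] = 1   # mid-batch register write
            self.t += 1

class Video:
    def __init__(self, bus):
        self.bus, self.t, self.sampled = bus, 0, None

    def run(self, n):
        for _ in range(n):
            if self.t == 3:
                self.sampled = self.bus.get("scroll", 0)
            self.t += 1

# Batched: the CPU runs cycles 0-9 first, so its cycle-5 write is already
# visible when the video chip later samples at its cycle 3 -- wrong order.
bus = {}
cpu, vid = Cpu(bus), Video(bus)
cpu.run(10)
vid.run(10)
assert vid.sampled == 1   # write observed "too early"

# Cycle-accurate lockstep: interleave one cycle at a time -- the video chip
# samples at cycle 3, before the CPU writes at cycle 5.
bus = {}
cpu, vid = Cpu(bus), Video(bus)
for _ in range(10):
    cpu.run(1)
    vid.run(1)
assert vid.sampled == 0   # correct order: sample precedes the write
```

The lockstep version does ten times as many scheduling switches for the same work, which is exactly the performance cost the comment describes.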
GPUs are for data-parallel operations, which doesn't help for emulators, which are (for the most part) running a much slower traditional CPU and a handful of support chips all at the same time; that is not data-parallel.
For pretty much all 8-bit home computers, cycle-correct emulation is essential; without it most modern scene demos simply don't work, and neither do a lot of old games (although those are usually more forgiving).
In later computer architectures, hardware components have become more and more decoupled from each other, running on separate clocks and busses, with caches and buffers in between and whatnot, all of which makes timing less predictable but also less important in emulation, giving the emulation much more slack when it comes to "synchronicity" (which ironically makes modern computer systems "easier" to emulate than older systems - at least when it comes to correct timing).
But 8-bit home computers (and also the early 16-bit systems like the Amiga and Atari) were essentially a single 'mega-chip', all running off the same clock, with all chip timings being deterministic, which was and is exploited by software.
"cannot" isn't accurate. "cannot easily" would be accurate.
For example, FPGAs themselves can be emulated in software with precise timing accuracy, so anything you can run on an FPGA can be emulated in software without the FPGA. It's difficult and sometimes infeasible with current CPUs to get up to the same speed though. This depends a lot on the FPGA circuit being emulated.
Source: I used to work on an FPGA-targeting hardware compiler and accelerated FPGA circuit simulators.
The main advantage is it makes it possible to do more accurate emulation without needing to sacrifice performance in the same way as software emulators. But the very large gap between affordable CPU performance and affordable FPGA performance makes it not an obvious tradeoff.
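The precise-timing software simulation described above usually boils down to two-phase evaluation: compute every register's next state from the current state, then commit them all at once, like a single clock edge. A minimal sketch (the circuit itself is invented for illustration):

```python
# Two 4-bit registers forming a tiny pipeline: a increments every clock,
# b latches a's pre-edge value, just as two flip-flops would in hardware.
def tick(state):
    nxt = {
        "a": (state["a"] + 1) & 0xF,
        "b": state["a"],      # reads a's value from *before* this clock edge
    }
    return nxt                # all registers "commit" atomically

state = {"a": 0, "b": 0}
for _ in range(3):
    state = tick(state)
assert state == {"a": 3, "b": 2}   # b always lags a by one cycle
```

Because next-state values are computed from a frozen snapshot, evaluation order within a cycle doesn't matter, which is what makes the simulation exactly match the FPGA's clocked behavior.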
Whilst it's true an FPGA doesn't inherently imply accuracy, it does make it simpler to recreate things in a more accurate way, in particular around interactions between CPU, audio, and video. It also enables accurate input latency with respect to these things.
When you look at typical FPGA emulator source code, it often looks pretty close to software emulator code ported to VHDL/Verilog, rather than an attempt to re-create the original reverse-engineered 'transistor-level design', which would automatically reproduce any 'undocumented behaviour' of the original chip (like http://www.visual6502.org/JSSim/index.html)
As such, an FPGA emulator isn't necessarily any closer to the original hardware behaviour than a software emulator. I guess the main advantage of FPGA is better performance on lower cost hardware.
> As such, an FPGA emulator isn't necessarily any closer to the original hardware behaviour than a software emulator. I guess the main advantage of FPGA is better performance on lower cost hardware.
The majority of FPGA chips are really expensive even compared against cheap (yet still way more powerful) SoCs, like the really cheap ARM CPUs used in sub-$100 handheld emulators. Also, technically an FPGA-based emulator could be more efficient than emulating everything in software, but AFAIK even the Analogue Pocket is not really that much better in battery life than, say, a Miyoo Mini+ (maybe because an ARM SoC has better energy management, but I don't know).
I think really the main hype of FPGA is lower input latency, which is really difficult to achieve with software emulation. There are still some tricks you can do in software that reduce input latency significantly, but they are generally expensive to compute, so it wouldn't be feasible in a cheap handheld device (at least yet).
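One such trick is "run-ahead", popularized by RetroArch: save state, speculatively emulate one or more extra frames using the freshest input, display the future frame, then roll back. A rough sketch against an invented toy core, which also shows why it multiplies the compute cost:

```python
import copy

# Stand-in for a real emulator core (invented for this sketch).
class ToyCore:
    def __init__(self):
        self.frame = 0

    def run_frame(self, buttons):
        self.frame += 1           # pretend to emulate one frame of the game

    def save(self):
        return copy.deepcopy(self.__dict__)

    def load(self, snapshot):
        self.__dict__.update(copy.deepcopy(snapshot))

def run_with_runahead(core, buttons, lookahead=1):
    core.run_frame(buttons)       # the frame we are logically on
    snapshot = core.save()        # snapshot before peeking ahead
    for _ in range(lookahead):
        core.run_frame(buttons)   # speculatively run extra frame(s)
    shown = core.frame            # the future frame actually displayed
    core.load(snapshot)           # rewind: emulated time is unchanged
    return shown

core = ToyCore()
shown = run_with_runahead(core, buttons=1)
assert shown == 2 and core.frame == 1   # 2 frames emulated per frame shown
```

Each displayed frame costs lookahead+1 emulated frames plus a save and a load, which is why this is cheap on a desktop CPU but painful on a low-end handheld.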
Many people think an FPGA is automatically accurate emulation because it's "hardware" and the original console is "hardware".
But the FPGAs are based on C software emulators, because that's where all the knowledge in the world of how to emulate the original system is kept. You can't translate original hardware to Verilog and skip the figuring-out-how-it-works process.
You're getting into the weeds and will get different answers based on semantics, here.
An FPGA emulation isn't inherently better than a CPU-based emulation of any given chip; it's not more authentic, because it still lacks the particular quirks of any old CPU that are associated with the way the hardware was laid out, path lengths, imperfections, etc.
I'm glad it's introducing FPGA programming to a wider audience because FPGAs are probably going to become more important going forward - and are probably going to be what keeps Intel alive - but it doesn't make the emulator inherently better.
I recently bought an Analogue Pocket and I think it's a great piece of hardware, but I'm really not a fan of this company's business model. This page has told me nothing I actually want to know about OpenFPGA.
Here's my question: If I'm developing an FPGA core, why should I develop for OpenFPGA instead of MiSTer? I want to know why it's better for preservation that I develop for OpenFPGA. Is it a more portable platform that has a more guaranteed future? I need to be convinced that OpenFPGA solves problems that make it a more likely choice than MiSTer 10+ years from now.
If someone with more experience than me in this space can answer the above, I would be really grateful. I'm astounded that Analogue's page on OpenFPGA is all marketing fluff without actually answering this.
Even if this were the future of emulation, we need a lot more than emulators for anything to really be considered the "future of video game preservation".
Modern games so frequently require connections to servers - often needlessly. For example, Ubisoft is shutting down the servers for 2014's The Crew on March 31, 2024. That's less than 10 years after release. When the servers are shut down, the entire game will stop working - including the lengthy single player campaign. No emulator will bring that game back from the dead.
In a similar story, Square Enix recently announced that they will be shutting down Nier: Reincarnation's servers in April 2024, less than 3 years after its worldwide release in 2021. And yes, it's a single player game. My wife is a fan of the Nier franchise and has been playing Nier: Reincarnation since its release. In a couple months she'll never be able to play it again. Square also pulled the plug on Babylon's Fall in 2023, less than 1 year after its 2022 release.
If you're concerned about game preservation, be prepared for the disaster coming soon. FPGAs won't be enough.
While this is being done intentionally now, it has been an issue for over a decade, and it hasn't kept people from preserving these games in the slightest.
Even now, there is a lot of work being put into reviving games that required master servers etc. by means of reverse engineering and, essentially, cracking. I do however agree with you that this is not the way it should be, and I still think developers and publishers should be legally bound to either allow software that they sell to function indefinitely, or release the server code when they shut down the official servers.
I think "in the slightest" may be a bit of an exaggeration. Not every game with a required server connection has been resurrected. And as this problem continues to escalate, I'm doubtful that hackers and preservationists will be able to keep up.
I play Overwatch with my sons and not only can you not run a personal server, the game frequently has fundamental mechanic changes for certain characters (and all characters in this Tuesday’s update). The Overwatch of a year ago is a very different game than is now or will be a year from now.
If someone was to provide an Open Overwatch server, I don’t even know what it would look like at this point since game clients aren’t available for particular versions (maybe on PC, but not console). When Microsoft is done with Overwatch it’ll be gone.
I actually didn't count live service games like Overwatch in this category, to be honest.
Overwatch is deliberately designed as an ever evolving service instead of a product you purchase. With games like these, there is no way to ever archive them, since there is no canonical state in which the game remains for any sensible amount of time.
I understand that this is a cause of frustration and that there's a lot of issues with this but I'd keep those separate from one-time-purchase products which fortunately is still what games mostly are [...for now].
> Square Enix recently announced that they will be shutting down Nier: Reincarnation's servers in April 2024, less than just 3 years after its worldwide release in 2021. And yes, it's a single player game.
Hmm? The Nier games are always online? I keep meaning to try them (although by now I probably have to buy used discs). You're saying there's no point any more?
So, what happens if this particular FPGA no longer can be bought?
Isn’t it more likely it will be possible to compile and run current C code on 22nd century hardware (possibly on some virtualization solution) than that it will be possible to compile and run FPGA code on 22nd century hardware?
I agree. For me, one of the challenging long term parts of video game preservation will be non-standard controllers and other peripherals, such as the Wii remote, Wii balance board, Guitar Hero/Rock Band guitars and drums, etc. Sure, technically you can use a mouse and keyboard for some of those, but it’s fundamentally an entirely different experience from the original.
The Wii came out over 17 years ago and third party companies still make controllers for them. Maybe that will still be the case 40 years after release, but eventually nearly all Wii consoles will stop working. Will there always be a big enough market of people playing Wii games on emulators to justify making those controllers and peripherals? I hope so, but am not sure. There almost certainly won’t be enough demand for 3rd party Wii balance boards, which I can honestly live with, but I do hope the main controllers themselves are still available for purchase, or possible to make out of other hardware available in the future.
I agree; this is tying the video game to more recent hardware, which will also disappear in its own time. Much better to have a full software emulation that can be ported / recompiled to newer CPUs and new OSes.
I’ve pasted the specs below, but in my opinion the biggest difference is that the OpenFPGA - in its Analogue Pocket form - is an end-user friendly target. MiSTer is more “enthusiast-friendly”, with more options and upgrades (including a recommended add-on to the basic kit).
MiSTer “tech specs”:
Intel/Altera Cyclone V SE (5CSEBA6U23I7) FPGA SoC with 110,000LE (41,500ALM) and 5,570Kbit of Block RAM.
ARM Cortex A9 dual-core CPU at 800MHz.
HDMI video and audio allowing easy connectivity to any modern monitor/TV.
1GB of DDR3 RAM that is directly available to both ARM and FPGA.
High-speed ARM <-> FPGA interconnect due to both being in the same chip.
Analogue Pocket “tech specs”:
Intel/Altera Cyclone V FPGA
49K logic elements and 3.4Mbit BRAM
Intel/Altera Cyclone 10
15K logic elements
2x independently addressable 16MB cellular RAM (128Mbit x 16), for 32MB of low latency memory total
1x synchronous DRAM, 64MB (32Mbit x 16)
The Pocket has an FPGA from the same series as the MiSTer's, but it's a lower-end model optimized more for power consumption. Most notably, it's missing the 800 MHz ARM core (meaning that it runs a custom "OS" off of a microcontroller rather than Linux like the MiSTer), and it has a bit less than half the number of logic elements. This means that 16-bit console emulation is about the highest you can go on the Pocket, while the MiSTer can emulate the PlayStation, Saturn, and N64.
The cores that are community contributed can be open source. The hardware and tooling are not open, so it doesn't meet most people's definition of open. On the spectrum it's much closer to being open than closed source emulation products that first parties distribute.
Is MiSTer considered open? It's dependent on a closed source toolchain to generate the bitstream.
No. You need to explicitly request the tooling from the for-profit company (Analogue, which wraps Intel/Altera) so that the open source cores you write can run on their closed source hardware. It sounds exactly like the situation with MiSTer, except that there is a second for-profit middleman. AFAIK, MiSTer cores can't run on open source hardware or compile on open source toolchains.
It's really a matter of taste whether you like Intel more than Intel+Analogue. It's wrong to call MiSTer an open source project but OpenFPGA not, just because the number of for-profit companies needed is different.
Note that ECP5 and Nexus support is similarly good by now, and these families have larger FPGAs better suited for MiSTer cores.
There's now also some support for the GW1N/GW2N FPGA families from China. These have the advantage of being relatively cheap, and I hear they have FPGAs with 250k+ LBs coming, with RISC-V hard cores built in.
I am surprised no one has yet tried to make a portable shell for the DE10 Nano (MiSTer FPGA's target platform). With a power usage of 10 watts, I think it should be quite feasible, even with the addition of a small LCD panel. However, it would probably require re-engineering some of the standard add-on boards for the form factor.
Though sadly it's remained a prototype. I think the issue they had was you needed to mod the DE10 Nano (remove the pin headers and I think the ethernet jack) to fit it into the portable form factor. So they had concerns with either selling it in kit form where people had to do the mods themselves to expensive boards or doing it in house and effectively reselling modded DE10s.
A fully custom board would be the solution, but then price is a big issue. The DE10 Nano is very cheap versus its BOM cost at catalogue prices.
I purchased the Analogue Pocket and it’s a great little device. I’ve got an archive of all GB and GBC games and can play them off an SD card using some of the FPGA cores available. While expensive, it’s great to have that tactile feel, just like an original Game Boy, with modern quality of life features.
Just a reminder that Analogue have made good on very few of their promises to those they stole time from.
Analogue isn't even the only ones to do an FPGA rebuild of a console; there's a near-perfect GB/GBC clone (with quirks toggles) that fits into a traditional Gameboy shell: https://funnyplaying.com/products/fpgbc-kit
> The next conversation unfortunately only brought more red flags. The first hint of impacting mGBA development had dropped: suddenly they were talking about delaying an mGBA release for a nebulous amount of time, directly contrary to what had been discussed prior. [...] The next conversation was suddenly about delaying until after the Pocket was released. At no point was such a thing discussed prior, but it was worded like it was. This was explained as putting a little bit of extra time after the release, though the reason was left implied; presumably they didn’t want mGBA stealing the Pocket’s thunder, as though that were at all a realistic scenario. And the amount of extra time proposed? Six months.
> By now it was clear to me that they didn’t respect me at all and there was no truth to the claim of the job not impacting my open source work. It all seemed to point to them seeing me as a source of cheap labor and then didn’t care at all how it impacted me, so long as I did the work for them. [It] really reflects on how little Analogue seems to actually care about the retro emulation community as a whole. In conversations with other emulator developers over the past it was spelled out that kevtris thinks of FPGA-based hardware emulation as inherently superior to software emulation, and is plenty willing to keep research he does towards the goal of perfecting his hardware solutions private, all while claiming that not only is it not even emulation (with an asterisk of course), it’s also the only route to perfect emulation. Neither of these claims is true.
> When I asked kevtris if he would release all of the documentation he had on GB/GBA he had said yes, after the Pocket shipped, but I’ve yet to see him release any of the documentation he’d promised for other projects, such as SNES, which have had products on the market for years now. The most I’ve seen is extremely basic overviews of a handful of obscure GBA behavior that, while valuable, is assuredly a tiny fraction of what he has.
The emulator writer scene lives on back-chatter. Analogue isn't even the only one... ask some developers what they think of RetroArch, who simply bundle up emulator cores... You won't get pretty answers.