Can someone explain this to me? I recently wrote some music for a friend’s 8-bit video game. I did what any sane composer would do and composed it in a DAW with my synths. I then exported a MIDI file and brought it into the tracker app that translated it into Z80 assembly. The app I used was Arkos Tracker 2, which appears to be a direct UI translation of an old Atari music app. I can understand it having such an obtuse interface back when you used it on an 8-bit Atari computer, but why on earth would anyone start writing music in such an app today, when there are free tools like GarageBand (and I’m sure equivalents on other platforms) that let you work like a musician instead of a programmer? (And I say that as a programmer.) It would be like a word processor that required you to enter your text as the hexadecimal values for the ASCII characters you want. I get having tools to tweak the generated code, but why not have a musical interface as the main way of entering your musical data? That said, I thoroughly enjoyed working within the constraints of an 8-bit platform. It was a challenge and a joy!
Some musicians think like programmers, and writing with characters in columns and rows is as comfortable to them as writing notes on a staff. That’s why there are trackers that output MIDI events, and even a tracker-based DAW (Renoise). I’ve known musicians who considered their own use of trackers a kind of ‘cheating’ because it was so much more comfortable to churn something out in a tracker than to, say, perform a part and fix it up in a piano roll.
For some platforms, tracking is also much closer to what is actually happening on the hardware, in how sounds and effects are triggered, than the event-based model of MIDI is.
As a person split 50-50 between music and coding, I, too, have pondered this... there are several factors. Most important are simply the resource constraints... traditionally the songs need to play on a real machine that only dedicates a few cycles to music-chip register updates. We don't generally need a sidtune that couldn't play on a real C64, so an editor shouldn't synthesize a set of calculations so abstracted and convenient that it overloads the CPU. Second is simply tradition: these SID trackers tend to imitate each other, to the benefit of both users and developers. Third is that modern composition paradigms employ modularity and composition of modules, such as LFO->Sample and Hold, buffer effects, or wild polyphony. No one has really dared extend those affordances to SID code that could run on a real C64; instead, folks with those urges steer toward using a DAW with a SID plugin for convenience.
Nevertheless, I do find it tedious to have to recall the bitfields of the waveform register, and I often wish for an editor that minimizes that necessity. After griping to the author of SID Factory II, I learned his SID editor at least fixes up wavetable branch commands when instruments lower on the table insert extra steps...
None of that addresses my question. They could still have the software emulate only what a C64 SID can actually do, but allow you to enter notes on either a musical staff or at least a piano roll. Or heck, even a step sequencer. That seems infinitely better than entering hex digits into columns.
Well, you know, whatever interface you use for input, it's an abstraction over the "hardware". Even classical notation on paper is actually not that great an abstraction over physical instruments: instruments can make plenty of subtle noises that the notation can't hope to capture, and which you have to rely on a performer to intuit.
With programmed music, as you know, you don't get that. What you enter is what you get. You can hope that your DAW gives fantastically good defaults for all the subtle performance features, or even an interface to let you tweak them.
And here's the key to those interfaces: they are very close to the hardware of the instrument (the SID chip) and to the "performer" (the playback routine). When a SID tracker comes with a sound table and a filter table, for instance, that's because it's a very close mapping of how you make sound with this thing. Sure, they could always write "triangle wave" instead of $11, but if you know the instrument, you know that $11 is triangle wave anyway.
You know the difference between a typical composer working in a DAW writing a clarinet part, and a clarinetist working with a DAW writing a clarinet part. It's not that the former can never do a good job, it's just that Floex will in all likelihood do it better. And in the same way, you'd need to understand the chip as well as LMan, Mahoney or LFT do in order to get the best out of the instrument.
Usually, to get the best out of the SID, the music and the instruments need to be hand-optimised to work within the SID's constraints. That usually means being able to edit the actual bits/values sent to the SID hardware, which is quite a low-level task.
This MusicStudio tool does allow MIDI files to be imported for final tweaking and optimisation.
I'm a big fan of Deflemask, but this does look cool. Will have to have a play later. Sounds fantastic, and I really do love the low level looking interface. Hoping it allows "poking the pins" directly, but will need to have a proper look this evening. Great work!
SIDPLAY is a Commodore 64 music player. It emulates the sound chip and other internals of the Commodore 64 home computer to play back music that was originally written on it. The sound chip of the C64 is called the Sound Interface Device, or SID, and therefore this music is known as SID music. The SID chip is basically a three-voice synthesizer on a chip. The unusual combination of digital and analogue circuitry is the reason for its…
If native is better, there's always the option of a program running on the C64 itself. I know Blackbird is good, and SID-Wizard is also quite popular. The ergonomics aren't as bad as you'd expect...or, not as bad as I would expect anyway.