I don't mean just vacuum tubes or even electronics at all. Mechanical analog computing is insane when you get down to it. You have special shapes that move against each other and do calculus.
We make these mechanical models as analogs of more complex physical systems. We can turn huge calculations into relatively simple machines. That we can roll two weirdly shaped gears together and get an integral out says to me something very profound about the universe. I find it to be one of the most beautiful concepts in all of the sciences.
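If it helps to see the idea in code: here's a toy numerical model of the wheel-and-disc integrator found in differential analysers. It's a sketch of the principle, not any particular machine's geometry.

```python
import math

# Toy wheel-and-disc integrator: a small wheel rides on a spinning disc at
# distance r from the centre, so each increment of disc rotation turns the
# wheel by r / wheel_radius. Accumulating those turns computes an integral
# with nothing but geometry.
def disc_integrator(r_of_t, t_end, dt=1e-5, wheel_radius=1.0):
    theta, t = 0.0, 0.0
    while t < t_end:
        theta += (r_of_t(t) / wheel_radius) * dt  # instantaneous "gear ratio"
        t += dt
    return theta

# Integral of cos from 0 to pi/2 is sin(pi/2) = 1.
print(disc_integrator(math.cos, math.pi / 2))  # ~1.0
```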
What's even more wild is that we can take those mechanical analogs of physical systems and build an electronic analog out of vacuum tubes. That vacuum tubes work at all is just completely insane, but it's some absolutely beautiful physics.
And yes, there are equally beautiful problems that can only be solved in the digital domain, but it just doesn't speak to me in the same way. The closest thing is the bitwise black magic like fast inverse square root from a special constant and some arithmetic. Besides, that's more a property of number systems than it is of digital computation.
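For the curious, here's the classic fast inverse square root transcribed into Python; the whole trick is reinterpreting the float's bits as an integer, which C does with pointer punning:

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    # Reinterpret the float's bits as a 32-bit integer.
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The famous magic constant: one shift and subtract approximates 1/sqrt.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson step sharpens the estimate.
    return y * (1.5 - 0.5 * x * y * y)

print(fast_inv_sqrt(4.0))  # ~0.5
```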
I understand how and why digital took over, but I can't help but feel like we lost something profound in abandoning analog.
The tide height is a function of the Earth/Sun/Moon system. The Earth and Moon aren't at a fixed distance from each other, and neither is the Sun, so every day's tide is unique, but you can predict the range.
The analog way to do it is to make a gear for each component of the system and synchronize them all. Then you use them all to drive one final gear, which shows you the prediction for the time you've chosen.
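In software terms, what those synchronized gears compute is just a sum of cosines, one per tidal constituent. A minimal sketch; the amplitudes, speeds, and phases below are made-up illustrative values, not any real port's:

```python
import math

# Each (amplitude, speed in deg/hour, phase) triple is one gear/pulley pair
# in Kelvin's tide predictor; the machine sums their motions mechanically.
CONSTITUENTS = [
    ("M2", 1.20, 28.984, 110.0),  # principal lunar semidiurnal
    ("S2", 0.40, 30.000,  95.0),  # principal solar semidiurnal
    ("K1", 0.25, 15.041, 200.0),  # lunisolar diurnal
]

def tide_height(t_hours):
    """What the final gear reads out: the sum of all constituents."""
    return sum(a * math.cos(math.radians(speed * t_hours - phase))
               for _, a, speed, phase in CONSTITUENTS)

for h in range(0, 25, 6):
    print(f"t={h:2d}h  height={tide_height(h):+.2f}")
```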
[1] https://zapatopi.net/kelvin/papers/on_the_age_of_the_suns_he...
[2] https://www.youtube.com/watch?v=IgF3OX8nT0w from about 3 minutes.
Once you really understand how these systems are an analog of a physical problem, everything makes so much more sense
Honestly it seems like a perfect application. Neural networks are analog systems. An analog computer can represent neurons very accurately and the entire network is inherently parallel, for free!
I can't wait to see what comes out of this research
Also, most people don't know that the word "analog", as in "analog circuits", comes from "analogy".
In today's world, we still build analogs, we just coerce them into strictly numerical, digital models. I don't know if you can call it better or worse, but digital is definitely less magical and wondrous than mechanical analog systems.
Imagine writing a program if every time you wanted to change something you had to cut a new gear, or design a new mechanism, or build a new circuit. Imagine the sheer complexity of debugging a system if instead of inspecting memory, you have to disassemble the machine and inspect the exact rotation of hundreds of gears.
Analog computing truthfully doesn't have enough advantages to outweigh the advantage of digital: you have one truly universal machine that can perform any conceivable computation with nothing but pure information as input. Your application is a bunch of binary information instead of a delicate machine weighing tens to hundreds of pounds.
Analog computing is just too impractical for too little benefit. The extra precision and speed is almost never enough to be worth the exorbitant cost and complexity.
Neural networks are a very good application for analog computing (imo). You have a ton of floating point operations that all need to happen more or less simultaneously. And what are floating point numbers if not digital approximations of analog values? :)
This can be implemented as a network of transistors on a chip, but driven in the linear region instead of trying to switch them on as hard as possible as fast as possible. Which is, I believe, what researchers are trying to do.
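A toy of the idea in code: in an analog crossbar, Ohm's law does the multiplications and Kirchhoff's current law does the additions, so a whole matrix-vector product happens at once. The numbers here are illustrative only.

```python
import numpy as np

# Weights are stored as conductances G (in siemens); inputs arrive as
# voltages V. Each cell contributes a current I = G * V (Ohm's law), and
# currents flowing into the same column wire simply add (Kirchhoff).
G = np.array([[0.5, 0.1],
              [0.2, 0.4]])   # conductance matrix = the layer's weights
V = np.array([1.0, 0.5])     # input voltages = the activations

I = G @ V                    # every weighted sum computed "simultaneously"
print(I)                     # [0.55 0.4]
```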
There are also some interesting ideas about photonic computing, but I'm not sure if that's going anywhere.
A few months back, someone on YouTube attempted to design a mechanical neural network as a single 3D printed mechanism. It ended up not working, but the concept was pretty solid.
Perhaps that's only because we haven't begun to understand analog yet. And our crude original perceptions have long suffered for being ignored. For example, I have yet to actually hear any digital music ... that didn't have to pass through a D-to-A converter. Hell, we may even learn that braining is not really the product of individual neurons at all, but a coordinated ballet oscillating like seaweed. I'll go bigger: is consciousness analog?
Both domains are extremely well understood. Analog electronics is an incredibly deep field, and forms the foundations of basically all of our electronic infrastructure. For instance, the transceivers that power cell stations are all analog and are incredibly complex. This stuff would seem like alien magic to anyone from even 30 years ago. The sheer magnitude of complexity in modern analog circuits cannot be overstated.
As for analog computing, well, it's just math. We can design analog computers as complex as our understanding of the physics of the system we want to model. There's not really any magic here. If we can express a physical system in equations, we can "simply" build a machine that computes that equation.
> I have yet to actually hear any digital music ... that didn't have to pass through a D-to-A converter.
This is simply not true. There are plenty of ways to turn a digital signal into sound without an intermediate analog stage. See PC speakers, piezo buzzers, the floppotron. You can also just pump a square wave directly into a speaker and get different tones by modulating the pulse width.
The reason we use an intermediate analog stage for audio is because direct digital drive sounds like total trash. I won't go too much into it, but physics means that you can't reproduce a large range of frequencies, and you will always get high frequency noise that sounds like static.
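To hear what direct digital drive sounds like, here's a minimal Python script that writes a raw 440 Hz square wave to a WAV file, with no filtering at all:

```python
import struct, wave

RATE, FREQ, SECS = 44100, 440.0, 1.0

frames = bytearray()
for n in range(int(RATE * SECS)):
    phase = (n * FREQ / RATE) % 1.0
    # Every sample is full-on or full-off: hard edges, harsh harmonics.
    frames += struct.pack("<h", 30000 if phase < 0.5 else -30000)

with wave.open("square.wav", "wb") as w:
    w.setnchannels(1)       # mono
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(bytes(frames))
```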
Edit: I didn't notice your username before. All 8 bit systems make heavy use of the square wave voice, which is a digital signal. But it's typically passed through an analog filter to make it sound less bad. Music on e.g., the first IBM PCs was purely digital, played through a piezo beeper on the motherboard.
The strength of digital is that your logic is implemented as information instead of physical pieces. Your CPU contains all the hardware to perform any operation, and your code is what directs the flow of information. When you get down to bare basics, the CPU is a pretty simple machine without much more complexity than a clockwork mechanism. It's an extremely fascinating subject and I very highly recommend Ben Eater's breadboard CPU videos on YouTube. But I digress.
The real trick is that digital computers are general purpose. They can compute any problem that is computable, with no physical changes. It's purely information that drives the system. An analog computer is a single-purpose device[0] designed to compute a very specific set of equations which directly model a physical system. Any changes to that set of equations requires physical changes to the machine.
[0] general purpose analog computers do exist, but generally they're actually performing digital logic. There have only been a few general purpose true-analog computers ever designed AFAIK. See Babbage's Analytical Engine for the canonical general-purpose mechanical design (itself digital, not analog).
Most DNA errors turn out to be inconsequential to the individual. If a cell suffers catastrophic errors during reproduction, it typically just dies. Same for embryos: they fail to develop and get reabsorbed. Errors during normal RNA transcription tend to encode an impossible or useless protein that usually does nothing. Malformed RNA can also get permanently stuck in the cellular machinery meant to decode it, but this also has no real effect. That ribosome floats around uselessly until it's broken down and replaced. You've got a nearly infinite number of them.
DNA and all the machinery around it is surprisingly messy and imprecise. But it all keeps working anyway because organisms have billions or trillions of redundant copies of their DNA.
*take with a grain of salt, I last studied this stuff many years ago.
'Imperfect components' is a value judgement. Apparently an analog world was a necessary part of self-replicating 'mechanisms' arising while floating in the analog seas.
As for something you can easily get your hands on, micrometers are incredible. A simple screw and graduated markings on the shaft and nut give you incredibly precise measurements: a 0.5 mm pitch screw with 50 thimble divisions resolves 0.01 mm per division. You can also find mechanical calculators (adding machines) on eBay. But those really aren't very sexy examples of the concepts.
Analog computers aren't very common anymore. Your best bet is visiting one of the computer museums that house antique machines. Or watching YouTube videos of people showing them off. There's plenty of mechanical flight computers in particular on YouTube.
If you have access to a 3D printer, there's plenty of mechanisms one can print. The antikythera mechanism is a very interesting celestial computer from ancient times, and 3D models exist online.
These machines can calculate ballistic trajectories with incredible accuracy, accounting for the relative motion of the ships, wind speed, and even the curvature of the earth. Those calculations are not at all trivial!
https://youtu.be/s1i-dnAH9Y4?si=oHHJGRqnFx-ydQu1
- Tarantool is some sort of in-memory DB with optional persistence
- Red is a programming language that has made the odd syntax decision to use {} for strings and [] to define scopes
- U++ is one of those all-encompassing C++ frameworks like Qt
- Lazarus is a Pascal(?) IDE
- And FASM is a toolkit for building assemblers
I'm struggling to find the common thread across these links, apart from the OP probably being an enthusiast of obscure programming languages
http://www.rebol.com/
https://en.wikipedia.org/wiki/REBOL
It's a protocol/tool for async file transfer, built for disconnected/intermittent connectivity amongst known parties (trusted friends as p2p), allowing even for sneakernet-based file transfer.
It started as a modern take on Usenet, but it boggles my mind how cool it is:
Want to send a TV series to your friend? Send it via NNCP, and it will make it through either via line-based file transfer (when the connection allows; pull or push, cronjob, etc.), or even via sneakernet if there is "someone going that way".
The comms priority system lets you do high-priority message-checking over an expensive network link, while deferring bulk file transfer to trunk lines for later.
It can even be configured to run arbitrary commands on message receipt, to allow indexing/processing of files (like a ZFS-receive hook, mail/Matrix ingestion...)
See all the usecases: http://www.nncpgo.org/Use-cases.html
As with many of these cool techs, I just wish I had a good reason to use it =D
You can create highly specialized templates in Lua, and there's an RDBMS extension called Cargo that gives you some limited SQL ability too. With these tools you can build basically an entirely custom CMS on top of the base MW software, while retaining everything that's great about MW (easy page history, anyone can start editing including with a WYSIWYG editor, really fine-grained permissions control across user groups, a fantastic API for automated edits).
It doesn't have the range of plugins to external services the way something like Confluence has, but you can host it yourself and have a great platform for documentation.
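As a taste of that API: fetching a page's wikitext is a single GET against api.php. The wiki URL below is hypothetical, and real edits additionally need a login and CSRF token, omitted here.

```python
import requests

API = "https://wiki.example.org/w/api.php"  # hypothetical wiki

# action=parse with prop=wikitext returns the raw source of a page.
r = requests.get(API, params={
    "action": "parse",
    "page": "Main_Page",
    "prop": "wikitext",
    "format": "json",
})
print(r.json()["parse"]["wikitext"]["*"])
```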
Personally I would prefer a wiki with a git backend. I wrote one [1] but I don't recommend using it.
https://github.com/entropie/oy
[1] Docusaurus:
https://docusaurus.io/
[2] Tinasaurus:
https://github.com/tinacms/tinasaurus
As an administrator, I wish MediaWiki had a built-in updater (bonus points if it could be automated).
I get that by using the container distributions. I just mount my LocalSettings.php and storage volumes in the appropriate places and I get a new version.
And since I run on ZFS, I take a snapshot before updating; if something goes wrong I can roll back the snapshot and go back to when stuff just worked (and retry later).
https://www.reddit.com/r/Notion/comments/16zon95/are_there_a...
What I wish more people knew is that you don't need to do those things to get value from Nix. Creating project-specific dev shells that install the packages (at the correct versions) needed to work with a project can replace almost 90% of the docs for getting set up to work on it.
Conceptually a game changer for me. In practice it's far from a silver bullet (because every language prefers its own package management, so you still have to manage those), but when it works it's quite magical.
Or was the issue that you expected them to be portable? Or use commonly known dynamic library locations?
I was more or less pointing out the UX issues with Nix that end up turning many people away.
For example I tried to run pip install yesterday on MemGPT on Nix.
It failed with a C++ error because they use miniconda.
I just created a nix shell with python, pip, etc and ran the pip install command.
Things work fine.
I fell down the Nix rabbit hole, and miniconda was one of the worst things to get working. My first pass used an FHS environment, but eventually I just got the environment.yml file working in micromamba and used that instead. Except micromamba ships its own linker that I had to override with SHAREDLD, or some random Python C++ dependencies wouldn't compile correctly.
I love Nix, but my list of complaints is a mile long. If you want to do anything with OpenGL in Nix, but not on NixOS, just give up. nixGL just doesn't really work.
Good luck getting something like Poky (the reference project for Yocto) running in Nix. The only working example puts it in an FHS environment, which uses bubblewrap under the hood. But then, because you're in a container with permissions dropped, you can't use a VM. The solution I see in the support forums is to roll your own alternative FHS environment based on something else.
/Rant
[0] https://www.jetpack.io/devbox
One Dockerfile and a Poetry file work just as well. And it's simpler. It's literally the same thing, but using OS primitives to manage the environment rather than shell tricks. It makes more sense to me to use a dedicated OS primitive for the task it was designed for.
Additionally docker-compose allows you to manage a constellation of environments simultaneously. This is nowhere near as straightforward with nix.
I love nix but being honest here. It's not definitively the best.
The biggest reason right now to avoid it is adoption. Most people won't know what to do with a shell.nix
1) not just as well because docker is repeatable, not reproducible
2) not if you need GPU acceleration which is a headache in docker, but not Nix shells
> Additionally docker-compose allows you to manage a constellation of environments simultaneously. This is nowhere near as straightforward with nix.
- devenv.sh
- arion
- https://flakular.in/intro
> Most people won't know what to do with a shell.nix
The same was once true for Dockerfile
Not sure what you're saying here but most likely you're referring to some obscure pedantic difference. Effectively speaking docker and nix shell achieve similar objectives.
>2) not if you need GPU acceleration which is a headache in docker, but not Nix shells
This is true. But this is the only clear benefit I see.
>- devenv.sh - arion - https://flakular.in/intro
right. So? I said nowhere near as straightforward. This isn't straightforward. It's an obscure solution.
>The same was once true for Dockerfile
False. Dockerfiles are much more intuitive because they're just declarative configs. With a shell.nix it's mostly people who like Haskell or OCaml who are into that style of syntax. I like it, but clearly that syntax has not caught on for years and years and years. Quite likely Nix will never catch on to that level either.
Neither is using virtualenvs for Python packages with native extensions.
It is harder to write on average atm, but it's very much worth it to me when it comes to sharing code for development. Also, LLMs help quite a bit when writing nix.
Additionally nix uses shell hacks to get everything working. Docker uses an os primitive DESIGNED for this very use case.
And additionally, because docker uses os primitives you can use docker-compose to manage multiple processes on multiple different environments simultaneously. Something that's much harder to do with nix shell.
Also, one man's "DESIGNED" is another man's hacks. I don't see anything wrong with how nix works. Potato/potato, I guess.
I think I know what you're getting at. nix-shell provides a fast way to get access to that specific shell environment which is a bit more annoying to do with docker. All docker needs to do is provide this interface by default and the only surface level differences between the two techniques is really just the configuration.
>Also, one man's "DESIGNED" is another man's hacks. I don't see anything wrong with how nix works. Potato/potato, I guess.
By any colloquial usage of the term "designed" in this context by any unbiased party, it's obvious Nix is the hackier option under any charitable interpretation. NixOS is a layer on top of Linux; containers are a Linux feature. Thus creating a layer on top of Linux to use existing features is the hackier, less elegant solution.
It can actually go in the other direction. Rather than use shell tricks, Nix can also use containers under the hood. Overall, though, Docker's API is superior for editing config files but not for switching shells. Additionally, the underlying implementation of Docker is also superior.
Your main problem is with the API which is just opinionated.
a) Unless you literally write everything in one language, you will have to deal with learning, supporting and fixing bugs in N different package/environment managers instead of just one.
b) If you have a project that uses several languages (say, a Python webapp with C++ extensions and frontend templates in Typescript), then Nix is the only solution that will integrate this mess under one umbrella.
b. C++ is the only one that would benefit from Nix here, because C++ dependencies are usually installed externally; there's no folder with all the external sources in the project. Even so, this can be achieved with Docker. If you want, you can have Docker call some other scripting language to install everything, giving you "one language", which is essentially what you're doing with Nix.
b. No, docker is not a solution. Docker is another problem and a separate maintenance nightmare.
(Nix solves maintenance problems at scale, Docker explodes them exponentially. I would not ever recommend using Docker for anything except personal computing devices you don't care about.)
There are many python packages that have other dependencies not managed by Python package management. The pain of figuring out what those implicit dependencies are is effectively removed for users when configured as a nix shell.
Additionally, as machine-generated content proliferates, I think having services use something like the web of trust concept for membership would be super powerful. The problem is, of course, the terrible UX of cryptographic signatures. But I think there's a lot of opportunity for the group that makes it easy to use.
[0]: https://en.wikipedia.org/wiki/Web_of_trust
Programmability though
- https://arcan-fe.com/2022/10/15/whipping-up-a-new-shell-lash...
- https://arcan-fe.com/2021/04/12/introducing-pipeworld/
- https://arcan-fe.com/2020/12/03/arcan-versus-xorg-feature-pa...
- https://arcan-fe.com/2021/09/20/arcan-as-operating-system-de...
The latest EU-funded 'a12' things are also soooo high concept, but not a fever dream.
Your book looks great, will check it out.
A Nim talk would be a great fit for the event.
Thanks for mentioning this! I work remote in SC and it's nice to hear about a nearby convention.
At the time, installing Nimble (the Nim package manager) also required me to have NPM. This was not ideal, but looking at [the Nimble install docs](https://github.com/nim-lang/nimble#installation) it seems like it is now packaged with the language.
Might try dusting it off for some AoC puzzles this year :)
In Python, for historical reasons the logging module uses camelCase while most other modules use snake_case, so it isn't really possible to use the logging module and maintain a consistent style. This is a non-issue in Nim.
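To make the camelCase point concrete:

```python
import logging

# The stdlib logging module predates PEP 8, so its public API is camelCase
# while the rest of your codebase is presumably snake_case.
logging.basicConfig(level=logging.INFO)   # basicConfig, not basic_config
log = logging.getLogger(__name__)         # getLogger, not get_logger
log.info("mixed styles, like it or not")
```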
E.g. it's something to check but not an error. You can easily set a config to make them an error or ignore them.
http://nim-lang.github.io/Nim/atlas.html
I call it a docs system rather than static site generator because the web is just one of many output targets it supports.
To tap into its full power you need to author in a markup that predates Markdown called reStructuredText (reST). It's very similar to Markdown (MD) so it's never bothered me, but I know some people get very annoyed at the "uncanny valley" between reST and MD. reST has some very powerful yet simple features; it perplexes me that these aren't adopted in other docs systems. For example, to cross-link you just do :ref:`target` where `target` is an ID for a section. At "compile-time" the ref is replaced with the section title text. If you remove that ID then the build fails. Always accurate internal links, in other words.
The extension system really works and there is quite a large ecosystem of extensions on PyPI for common tasks, such as generating a sitemap.
The documentation for Sphinx is ironically not great; not terrible but not great either. I eventually accomplish whatever I need to do but the sub-optimal docs make the research take a bit longer than it probably has to.
I have been a technical writer for 11 years and have used many SSGs over the years. There's no perfect SSG but Sphinx strikes the best balance between the common tradeoffs.
[1] https://www.sphinx-doc.org/en/master/index.html
They are starting to work towards full sphinx functionality in myst markdown, too.
[1] https://myst-parser.readthedocs.io/en/latest/intro.html
[2] https://executablebooks.org/en/latest/blog/2023/new-project-...
This podcast episode is worth a listen for anyone interested in these tools and where they're headed: https://talkpython.fm/episodes/show/354/sphinx-myst-and-pyth...
[0] https://www.sphinx-doc.org/en/master/usage/markdown.html
From Sphinx's Getting Started page:
> Much of Sphinx’s power comes from the richness of its default plain-text markup format, reStructuredText, along with its significant extensibility capabilities.
https://www.sphinx-doc.org/en/master/usage/quickstart.html#g...
I will have to dig into exactly how much parity we're talking here, but if it's very strong parity then I retract my previous statement.
Thanks for correcting me!
A notable exception is autodoc (automodule, autoclass, etc.), and any other directives that generate more rST. The current workaround is to use eval-rst:
https://myst-parser.readthedocs.io/en/latest/syntax/code_and...
Some more discussion about that in these issues:
https://github.com/executablebooks/MyST-Parser/issues/163
https://github.com/sphinx-doc/sphinx/issues/8018
What I really hope exists is a system where I can reuse documentation sections in other pages, ergonomically.
I built that system multiple times to do preprocessing with things like including parts, special linking, or referencing images from anywhere.
https://github.com/xmonader/publishingtools/tree/development...
[1] e.g. https://myst-parser.readthedocs.io/en/v0.13.7/using/syntax.h...
https://hieroglyph.readthedocs.io/en/latest/getting-started....
This meant I could first write a blog post on learning Clojure as a Pythonista[1]; then turn some code samples and tables and images into slides I could present at a public talk on my laptop or desktop[2]; and then finally publish a public notes document that talk attendees could use to easily study or copy-paste code examples[3]. (The notes are the exact same contents of the slides, just rendered in a simple single-page HTML format, with each slide transformed into a section heading, with permalinks/ToC auto-generated.) So, this is generated HTML from a single .rst source[4], all the way down! And, of course, I could version control and render the .rst file powering the slides / notes / etc. in GitHub.
[1]: https://amontalenti.com/2014/11/02/clojonic
[2]: https://amontalenti.com/pub/clojonic/
[3]: https://amontalenti.com/pub/clojonic/notes/
[4]: https://amontalenti.com/pub/clojonic/notes/_sources/index.tx...
Note: the slides in [2] do not play well on mobile. You are meant to use keyboard arrows to advance and tap “t” to switch into tiled mode (aka slide sorter) and “c” to open a presenter console. The slides are powered by a fork of html5slides, which will look familiar if you’ve seen the JS/CSS slide template that Go core developers use in https://go.dev/talks (they generate those with “go present,” a different tool, though).
More recently, I have also used a similar-in-spirit tool called marp (https://marp.app) for generating technical slides from source, but the output and functionality was never quite as good as rST + Sphinx + hieroglyph. The big advantages to marp: Markdown is used as the source, some tooling allows for VSCode preview, and PDF export is fully supported alongside HTML slides.
I have a soft spot for Sphinx, not only because it was responsible for so much great documentation of Python open source libraries (including Python’s own standard library docs at python.org), but also because the first comprehensive technical docs I ever wrote for a successful commercial product were written in Sphinx/rST. And the Sphinx-powered docs stayed that way for a ridiculously long time before being moved to a CMS.
[1]: https://github.com/psanford/wormhole-william
https://github.com/schollz/croc
There was this fun little number a couple of years back https://redrocket.club/posts/croc/
I'm not sure I can type out, with trembling fingers, how many dollars have been flushed down the toilet of CCSs by businesses that either had no business experimenting with componentized content, or businesses that didn't have resources for training up staff, or vendors who literally evaporated like morning dew after they'd gotten their initialization fees. So just one single story: one prime aerospace vendor I worked with had started their road to S1000D publishing in 2009. Today - at the end of 2023, and more than twenty million dollars later, with a garbage truck full of sweat and blood - that system has not released a single publication to the end user. Not one.
Previous discussions on HN: https://hn.algolia.com/?q=postgrest
Came here to mention Hasura as well (not sure of its popularity though) https://hasura.io/graphql/database/postgresql
- https://supabase.com/docs/guides/api#rest-api-overview
- https://supabase.com/docs/guides/getting-started/architectur...
Linux namespaces/cgroups but lighter than Docker.
I use it when I want to limit the memory of a Python script:
```
maxmem="56" # GB
firejail --noprofile --rlimit-as=${maxmem}000000000 python myscript.py
```
- gron (Greppable JSON): https://github.com/tomnomnom/gron (see the toy sketch of its idea after this list)
- MarkDownload (Markdown Web Clipper): https://github.com/deathau/markdownload
- Lean4 links:
-- Theorem proving: https://lean-lang.org/theorem_proving_in_lean4/introduction....
-- Natural Number Game: https://adam.math.hhu.de/#/g/leanprover-community/NNG4
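As promised above, here's a toy reimplementation of gron's core idea in Python: flatten JSON into one greppable assignment per path. (The real tool can also reverse the transformation.)

```python
import json

def gron(value, path="json"):
    # Walk the JSON tree, printing one "path = value;" line per node,
    # which is exactly what makes the output grep-friendly.
    if isinstance(value, dict):
        print(f"{path} = {{}};")
        for k, v in value.items():
            gron(v, f"{path}.{k}")
    elif isinstance(value, list):
        print(f"{path} = [];")
        for i, v in enumerate(value):
            gron(v, f"{path}[{i}]")
    else:
        print(f"{path} = {json.dumps(value)};")

gron({"user": {"name": "ada", "langs": ["py", "go"]}})
# json = {};
# json.user = {};
# json.user.name = "ada";
# ...
```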
ESPHome. It's a framework for declaratively building firmware for microcontrollers, based on rules like "This pin is an input with debouncing, when it changes, toggle this".
Contributing to them has probably been the most fun I've had programming in years.
We just need power management, and a C++ implementation of the Native API client. It's so close to being able to replace most of what I'd normally code by hand in Arduino.
https://esphome.io/
RealThunder's fork of FreeCAD: https://github.com/realthunder/FreeCAD
They fix so many issues. Linear patterns can duplicate other linear patterns!
Vorta: It's the best backup technology I've seen. Just an easy guided GUI for Borg, which gives you deduplication. I just wish they let you deduplicate across multiple repositories somehow.
I've been looking for a more convenient way to configure some ESP32-based input devices (similar to macropads). I was interested in QMK, but it doesn't support ESP32. So far I've been using MicroPython / CircuitPython, which I generally like, but on multiple occasions I've thought "I wish I could just put this in a config file."
The matrix keypad and key collector components look similar to what I was looking for. Can the key collector be used with other multiplexing methods like shift registers?
You can send keys directly to the key collector from wherever you want, but you'd probably have to configure an individual action for each key, unless there's a feature I'm not seeing.
Maybe you could create a new ShiftRegisterKeypad component?
Lithium Titanate sounds interesting - TIL...
Given that webassembly is a stack language with no GC, i do expect a comeback of concatenative programming some time in the future.
https://www.youtube.com/@siliconvalleyforthinterest1736
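If concatenative programming is unfamiliar, the whole model fits in a few lines: a program is a sequence of words transforming an implicit stack, which is also the shape of wasm's execution model. A minimal sketch:

```python
def run(program, words):
    # Each token either names a stack-transforming word or pushes a number.
    stack = []
    for token in program.split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))
    return stack

words = {
    "+":   lambda s: s.append(s.pop() + s.pop()),
    "*":   lambda s: s.append(s.pop() * s.pop()),
    "dup": lambda s: s.append(s[-1]),
}

print(run("3 dup * 4 +", words))   # [13], i.e. 3*3 + 4
```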
I literally made this mistake, creating a wasm interpreter, before I realized it was a terrible runtime bytecode.
It really does give the lightbulb moment. “Don’t try to generate code, that is impossible. Only try to realize the truth… There Is No Code (only data)”
https://github.com/automatic-ripping-machine/automatic-rippi...
Put a DVD/blu ray in a drive and it automatically determines the title, starts ripping, then pops the disc out when it's done.
There's options for post-ripping transcoding also.
However, lately I've come to like llama.cpp and friends. Yes, it's not the ChatGPT miracle or whatever, but how often do you /actually/ need that? Despite its tremendous popularity, it still seems like something more people should know about. For me, I've had great fun running LLMs locally and experiencing their different "flavors" from a more "phenomenological" perspective (what is it like to use them) rather than a technological one.
It’s perfect (so far) for my purposes of an extensible data model.
I'm sure others have augmented applications with "generic" data types (like properties and such). You always walk this fine line: if you stray too far, you find you're writing a database on top of a database.
We've also fallen into that hole in the past when building a DB schema: we stumble into what we coined "absurd normal form" or, colloquially, the "thing-thing" table that relates everything to everything.
Well, RDF is the thing-thing table, and it just embraces it. And for my project it’s a lot of fun. I have structured types, with specialized forms and screens. But, if desired, the user can jump into adding relations to anything. It’s essentially an RDF authoring environment with templates and custom logic to make entities. And in the end they can always dive into SPARQL to find whatever they want.
It’s not intended to work with zillions of data items, it’s just a desktop tool. I always found it interesting early on that the primary metric for triple stores was how fast they could ingest data, I guess nobody actually queried on anything.
Anyway, it’s fun and freeing to work with.
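For anyone who wants to poke at the "thing-thing table" directly, here's a tiny sketch using the rdflib Python library (the names and URIs are made up):

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()

# Every fact is just a (subject, predicate, object) triple.
g.add((EX.ada, RDF.type, EX.Person))
g.add((EX.ada, EX.knows, EX.grace))
g.add((EX.grace, RDF.type, EX.Person))

# And SPARQL queries the graph of triples directly.
for row in g.query("SELECT ?a ?b WHERE { ?a <http://example.org/knows> ?b }"):
    print(row.a, "knows", row.b)
```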
* SSH ForcedCommand. Lots of usecases here, for backups, file storage, git, etc.
* Verilog as a tool for software developers to learn digital electronics. VCS/code/simulation/unit tests are all a lot more familiar and expected for developers.
* Writing tools yourself. There's often decent stable libraries that do 90% of what you want, and the remaining 10% is less effort than dealing with awkward integration with off-the-shelf tools. This relies on having low overhead packaging/deployment, e.g. Nix/Guix/Bazel.
I rely on my home's v6 /56, so I don't have experience with using VMs for this, but I know of a few providers that offer /56 (and above):
* Mythic Beasts and Linode offer a /56 on request. They're not cheap VM providers though.
* https://ifog.ch/en/vps offer /48.
* https://tunnelbroker.net/ offer /48, which can be used via any VPS/home.
https://reddit.com/r/ipv6 for more info.
¹) http://wincompose.info/
²) https://en.wikipedia.org/wiki/Compose_key
https://espanso.org/
- Data diodes (unidirectional networks) - allow you to monitor a network without allowing external control (or only submit info, never ever exfiltrate it)
- GNU Radio - you can play with your audio ports, and learn instinctively how to deal with all the stuff that used to require DSP chips... then apply that knowledge with a $30 RTL-SDR dongle.
- Lazarus - seconding the above... a really good Pascal GUI IDE. The documentation needs work, but it's pretty good otherwise.
Fossil: distributed version control and much more in a single executable, from the creators of SQLite: https://fossil-scm.org/
This has accounted for about 90% of everything I've built since 1985.
Pick code generates my side project: https://eddiots.com/1
> Pick was originally implemented as the Generalized Information Retrieval Language System (GIRLS) on an IBM System/360 in 1965 by Don Nelson and Dick Pick [...]
I seriously miss it.
Every once in a while I try to get back into it. Usually it takes the form of trying (and failing) to get a demo/personal version of UniVerse, but lately I've been poking at ScarletDME a little bit. I'd even pay money (not much since this is just hobby stuff, but some) for UniVerse, but even the cost of it seems to be a closely guarded secret.
I HAVE to code in PICK.
"Unless it comes out of your soul like a rocket, unless being still would drive you to madness or suicide or murder, don’t do it." - Charles Burkowski
(Funny, they named the current support company "Rocket".)
Here's the link to the current UniVerse trial version (free and good until 04/2025). Get it, install it, and make something with it. Please don't let that part of you die.
https://www.rocketsoftware.com/products/rocket-multivalue-ap...
What's the trick to making that form work? It won't accept my @gmail.com address, and I don't really want to use my work email address and potentially mis-represent things. Especially since my work used to use one of Rocket's products.
If you have concerns about doing that, you can just download it from my website at
http://eddiots.com/UVTE_WINDOWS_11.4.1.zip (You may have to cut and paste this link into a new tab. HN doesn't seem to like this.)
If you have any problems or need the UNIX version, just reply here or contact me. email on my profile. Let me know how it goes.
My next phase is to put the PICK-generated SVG into CodePen and provide links to show how to draw the art with code.
Couple of things I like
- tarantool https://www.tarantool.io/en/
- rebol/red-lang https://www.red-lang.org/
- U++ : https://www.ultimatepp.org/
- lazarus: https://www.lazarus-ide.org/
- fasm: https://flatassembler.net/
"vopono is a tool to run applications through VPN tunnels via temporary network namespaces. This allows you to run only a handful of applications through different VPNs simultaneously, whilst keeping your main connection as normal.
vopono includes built-in killswitches for both Wireguard and OpenVPN."
* Graph-relational database
* Queries return objects linked to other objects through properties, not rows
* ... But it's still Postgres under the hood
* Open source
The Cyber Swiss Army Knife
- in-process databases (rocksdb, sqlite)
- FoundationDB
- C/C++ and low-level programming in general (I wish I'd learned those instead of JS when I was younger)
- State Machines, Actor Model (Orleans Net), Event Sourcing
- Bittorrent for other things than pirating (it looks like it's dying)
- Arduino and similar
- Seastar
- Arrow (ecosystem)
- costs next to nothing to charge
- fast and fun to get around
- never pay for parking
- cheap maintenance
- hauls groceries easily
- good exercise
Laws making that illegal are extra stupid since it's relatively hard to kill a pedestrian with a bicycle but downright easy to kill a cyclist with a car.
No, they shouldn't. The sidewalk is for pedestrian traffic; that's what the "walk" in the name signifies.
> Laws making that illegal are extra stupid since it's relatively hard to kill a pedestrian with a bicycle
Sidewalks can't handle much bike traffic, are suboptimal for it (which is why purpose-built separated bicycle trails are built like roads, not sidewalks), and are in many places less safe for bicyclists, crossing driveways with less visibility for drivers and bicyclists than is the case with the road proper.
Sorry you're forced to slow down and pay attention occasionally
- Matrix. It's pretty popular but I see way too many open source projects still saying "join our Discord!" instead of "join us on Matrix!"
Python took 20 years after its introduction to become as popular as it is today, thanks to its more intuitive syntax that was based on ABC.
I really hope that 20 years after its introduction, D will be appreciated and will become a de facto language, not unlike Python is now. Perhaps even more popular, now that connected tiny embedded sensors and machines in the form of IoT are upon us.
A whole lot of innovation from the competitive debate community has quietly existed for decades now. Hopefully one day SV discovers all the cool shit debaters have been building for themselves.
[0] https://lv2plug.in/
[1] https://lv2plug.in/ns/ext/atom
Edit: Hydrocolloid blister plasters
- I'd like Emacs/org-mode knowledge to be common, at least starting from universities, because we need the classic desktop model, and Emacs is the only still-developed piece of software implementing it, alongside Pharo; but Pharo is usable only to play and develop, while Emacs is ready for end-user usage with a gazillion ready-made packages;
- feeds, in the broad sense, meaning XML automation on the net/web: so I can get my bills from a feed reader, all transactions digitally signed so both parties have proof (of course, a national PKI is mandatory), news, anything, in the same way, keeping my information mine and in my hands instead of wasting time on a gazillion crappy services;
- IPv6 with a global address per host, so we can finally profit from our modern fiber-optic connections instead of being tied to someone else's computer, i.e. "the cloud";
- last, but just as an aside: R instead of spreadsheets for the business guys, so they no longer produce and share crappy stuff, and LaTeX for similar reasons, to produce nice-looking PDFs...
https://imba.io/
For publishing documentation / to build the web site: Antora [2].
AsciiDoc has a bit more features compared to Markdown which allows for a richer and more pleasant presentation of the docs.
Antora allows you to have the project documentation in the actual project repositories. It then pulls the docs from all the different repos together to build the site. This also allows the released product versions to stay in sync with the docs versions. Antora builds each version of the product docs as part of one site. The reader can explore different product versions or navigate between pages across versions.
===
[1] https://asciidoc.org/
[2] https://antora.org/
It's a really simple alternative to something like wireguard.
- DNSSEC+DANE - It's deployed in a half-assed way, and there's a lack of end-user UX
- wais - search before gopher
- afs - distributed fs
- discard protocol - basically, a network-based /dev/null
- humans.txt - Not around as much as it was
- makeheaders - Auto-generated C/C++ headers
- man page generators - ronn and help2man
- checkinstall - The automatic software package creator
- bashdb and zshdb
- crystal - Compiled Ruby-ish
- forth - Powered the FreeBSD bootloader menu for many years and word processors (typewriter-like almost computers)
- ocaml - The best ML, used by Jane Street and Xen
- pony - A language built around an arguably better GC than Azul's C4 with arguably stronger sharing semantics than Rust
- prolog - Erlang's grandpa
- rpython - PyPy's recompiler-compiler
- pax - POSIX archives
- shar - shell archives - Self-extracting archives that look like scripts at the beginning
- surfraw - Shell Users' Revolutionary Front Rage Against the Web - founded by Julian Assange
- step-ca - A Go-based PKI server
- dmraid - Because it works
- X10 - Before WiFi and IoT, there was the Firecracker: a parasitic power serial port RF outlet controller
- FreeBSD - It's not unknown or obscure per se, but it powers many important things in the civilized world without getting much credit
- :CueCat - A dotcom era barcode reader that was given away
- Xen - If you need to run a VPS but can't ju$tify u$ing VMware
- KataContainers - k8s but with VM isolation
- stow - software management by symlinks
- habitat - similar philosophy as nix but not as clean and functional and almost like Arch PKGBUILD but with more infrastructure around it
- JTAG - debug all the things
- in-circuit emulators (ICEs) - hardware-based debuggers
- polarized light sources - easier to see things under the bi/trinocular microscope
Polarization is indeed a magical property of EM waves that is currently under-utilized and under-rated.
Currently I am working on highly reliable and robust polarized wireless systems that will hopefully be part of the next-gen 6G PHY.
and here for a book to learn it from: https://book.simply-logical.space/src/simply-logical.html
I think it is the closest thing to a "tool for expressing thought" with a proof procedure, which presently exists.
The openSUSE Build Service is also great for building packages for a lot of distros. It's not just for openSUSE.
https://kozubik.com/items/2famule/
One of my favorite things about the old C2 Wards Wiki is that it's like an archaeological site where time is frozen in this period and you can browse through preserved arguments about how Smalltalk and Extreme Programming will take over the world.
You can also try out a derivative like Inferno: https://www.vitanuova.com/inferno/ and https://en.wikipedia.org/wiki/Inferno_(operating_system)
https://news.ycombinator.com/item?id=38485309
It offers what I'd call a more compact approach to development, and it's quite straightforward.
I only used it for small GUI applications, but you can see what others have been building: https://www.ultimatepp.org/www$uppweb$apps$en-us.html
"Haxe can build cross-platform applications targeting JavaScript, C++, C#, Java, JVM, Python, Lua, PHP, Flash, and allows access to each platform's native capabilities. Haxe has its own VMs (HashLink and NekoVM) but can also run in interpreted mode."
It's mostly popular in game dev circles, and is used by Northgard, Dead Cells, and Papers, Please.
No experience with other platforms yet. Probably iOS will follow at some point, but I want it to be more feature complete first.
For the rest it's a very nice language with the usual tools.
The multiplatform part is probably why it's so popular with game developers, since you can target so many client platforms with only one codebase.
I’ve been using it for over 5 years now [1], and it’s as good as ever. It’s way faster than any other chat app I’ve used. It has a good UI and conversation model. It has a simple and functional API that lets me curl threads and write blog posts based on them.
(only problem is that I Ctrl-+ in my browser to make the font bigger – I think it’s too dense for most people)
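The API really is that simple. Here's a sketch of pulling a thread using the official Python bindings; it assumes credentials in ~/.zuliprc, and the stream and topic names below are made up.

```python
import zulip

client = zulip.Client(config_file="~/.zuliprc")

# Fetch the last 50 messages from one stream/topic pair.
result = client.get_messages({
    "anchor": "newest",
    "num_before": 50,
    "num_after": 0,
    "narrow": [
        {"operator": "stream", "operand": "general"},
        {"operator": "topic", "operand": "sphinx"},
    ],
})
for m in result["messages"]:
    print(m["sender_full_name"], ":", m["content"][:80])
```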
(2) re2c regex to state machine compiler - https://re2c.org
A gem from the 90’s, which people have done a great job maintaining and improving (getting Go and Rust target support in the last few years).
I started using it in 2016, and used it for a new program a few months ago. I came to the conclusion that it should have been built into C, because C has shitty string processing – and Ken Thompson both invented C AND brought regular languages to computing !!
In comparison, treesitter lexers are very low level, fiddly, and error prone. I recently saw dozens of ad hoc fixes to the tree-sitter-bash lexer, which is unsurprising if you look at the structure of the code (manually crawling through backslashes and braces in C).
https://github.com/tree-sitter/tree-sitter-bash/blob/master/...
These fixes are definitely appreciated, but I think it indicates a problem with the model itself.
(based on https://lobste.rs/s/endspx/software_you_are_thankful_for#c_y...)
[1] https://www.oilshell.org/blog/2018/04/26.html
I'd love for it to be back online but can't find the author.
Now I just use ChatGPT.
Of course someone will reply with a more complete language, but I'll start by throwing out array-based languages, in the form of J: https://www.jsoftware.com/#/
Once you really get your head around composing verbs, really working with arrays, and using exponents on functions, it's mind-expanding.
- VSCode devcontainers: https://code.visualstudio.com/docs/devcontainers/containers
Also, my company (VMware) has a really powerful YAML templating engine called ytt. I originally hated it and dunked on it constantly, but I have grown to love it. It makes creating composable and modular YAML really easy. It's extremely unfortunate that this is a real need, but when you need it, you need it.
Lastly, Cucumber isn't _unknown_ unknown, but I wish it was more widely used. Behavior testing is really useful even if the program has great test coverage underneath. Being able to express tests in pure English that do stuff is powerful and can be a bargaining chip for crucial conversations with product sometimes if done correctly. I mean, yes, we have GPTs that can write tests from prompts written in your language of choice and GPT Vision can manipulate a browser, but Cucumber is an easy stand-in IMO that is cheap and free!
I dream of a CMS akin to WordPress, but developed in LSP.
Lua is lean, with minimal syntactic sugar, and it feels like a 'complete' language. Therefore, we don't anticipate any additional bloat in the future.
Of course they also might just be a fan of J-Pop https://en.wikipedia.org/wiki/Gokuraku_Jodo
I've always wanted to build a digital clock entirely running on fluids. It would use fluid gates, and present a digital display by pushing blobs of coloured immiscible liquids back and forth through glass tubes (perhaps arranged as a seven-segment display). The counter itself would be made using fluid gates (which I don't know how to make). It would be slow; but for a wallclock with minute precision, you hardly need nanosecond gates.
So I wish "fluidonics" were popular.
Try the demo: https://store.steampowered.com/app/1528120/ComPressure/
If you have a distributed system, don't want to spend a lot of time wrestling with ELK, or fell out of your chair when opening the Splunk bill: Loki offers 90% of the features with an OSS model and a very simple deployment.
Complete game changer. Very simple to understand data model, alerts on logs, extract and group data from structured or unstructured logs, combine logs across sources, scales to both small and big systems.
It’s surprising other tools in the same space have such a hard time hitting the right balance between capability and cost+complexity. Logs are so essential you would think the tooling space around it was better.
It was truly interesting. Long story short, you stored your objects in the database, along with other objects. No object-relational mismatch.
Queries meant taking a subset of a graph (of objects). It was fast and performant, and fairly solid.
It's essentially the result of asking "what if database development had taken a different turn at some point?".
If the owning company would release it under some kind of open-source license (maybe open core, BSL, or one of those new revenue-friendly licenses), it could probably get very, very popular.
(I only wish I was being sarcastic.)
Tokenizes Chinese text into "words" for learning purposes and then renders the text in a GUI where you can click on a word to get the definition. It's not perfect, but an LLM fine-tuned for it will eventually result in much better "tokenization".
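The general idea, sketched with the jieba library rather than the linked tool (the text and output here are illustrative):

```python
import jieba  # a common Chinese word-segmentation library

text = "我喜欢学习中文"
# lcut splits running text into a list of "words" a learner can click on.
print(jieba.lcut(text))  # e.g. ['我', '喜欢', '学习', '中文']
```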
https://docs.saltproject.io/en/getstarted/
https://stackstorm.com
Are these genuine Bulma customers happy to support a product they use, or am I witnessing some new way of money laundering here? What is the Phone Tracking app doing there? I mean, Bulma needs all the support it can get, but what does a casino want from Bulma?
https://sshuttle.readthedocs.io/en/stable/
A few people on HN are into Buddhist meditation - I read mentions of Culadasa's The Mind Illuminated or Ingram's book. Indeed I've done several Vipassana and Zen retreats, but they just aren't as integrated as yoga's 8 limbs. They may lead to the same place eventually, but I think they take much longer.
If there was a device that made people feel as good as the awakening nervous system, the inventor would be a multi-millionaire, no question (in fact I think I heard a Western monk is involved in a startup to try to create one). It is truly unparalleled and something actually worth experiencing (from what I've seen so far).
For those interested in resources I've found helpful to experience these changes for myself here are 2 I recommend:
* https://www.aypsite.com/10.html
* https://morrismethodsandmore.com/schedules/
https://github.com/codecando-x/peregrine
Get a mailbox; people hand-write letters to it.
They get copied and redistributed.
People can write to each other.
Or a paper version of HN, printed like a newspaper.
How is your experience with that? Do you have it self-hosted, or do you use their offering?
It’s fun.
It creates VMs. Mostly Ubuntu Linux but there’s a slightly demented way to deploy Windows boxes too.
Hypervisor support is provided by a plugin system called the Cloud Provider Interface. Last I heard, vSphere, GCP, Azure and AWS are all reasonably well tested and maintained by their respective companies. OpenStack is technically there, but it's a nightmare and not well commercially supported. I've heard of stuff being deployed to Alibaba and Oracle but never seen those systems myself.
In practice this is mostly used to manage VMs into vSphere clusters.
It should be used more for building Data Pipelines specifically.
[0] https://github.com/clvv/fasd
[1] https://github.com/WillForan/fuzzy_arg
[2] https://zim-wiki.org/
[3] https://github.com/WillForan/zim-wiki-mode
[4] https://www.dokuwiki.org/xmlrpc
[5] https://github.com/flexibeast/emacs-dokuwiki
[6] https://github.com/WillForan/dotconf/blob/master/bash/PS1.ba... -- bash debug trap to update the prompt with escape codes that set the title to the previously run command -- to e.g. search windows for the terminal playing music from 'mpv'
Also, how do you use Zim wiki? I've been trying it for a month and I don't find it that great compared to something like Obsidian or QOwnNotes or even TiddlyWiki. Do you have a specific workflow?
I have an ugly and now likely outdated plugin for Zim to help with this. There's a small chance the demo screenshots for it help tie together what I'm trying to say. https://github.com/WillForan/zim-plugin-datelinker
On the tech side: my work notes (and email) have shifted into Emacs, but I'm still editing zimwiki-formatted files with the many years of notes accumulated in them. Though I've lost it moving to Emacs, the Zim GUI has a nice backlink sidebar that's amazing for rediscovery. Zim also facilitates hierarchy (file and folder) renames, which helps take the pressure off creating new files. I didn't make good use of the map plugin, but it's occasionally useful to see the graph of connected pages.
I'm (possibly unreasonably) frustrated with using the browser for editing text. Page loads and latency are noticeable, editor customization is limited, and shortcuts aren't what I have muscle memory for -- an accidental Ctrl-W (vim: swap focus; emacs/readline: delete word) is devastating.
Zim and/or Emacs is super speedy, especially with local files. I use Syncthing to keep computers and phone synced. But, if starting fresh, I might look at things that use Markdown or org-mode formatting instead. logseq (https://logseq.com/) looks pretty interesting there.
Sorry! Long answer.
Also, your "interstitial journaling" paradigm seems great; I'll try to apply it, because I enjoy grounding what I do in some loose chronology, kinda.
Thanks again for taking the time to expound on your approach!
E-ink displays...
FriXion pens
XMPP
Gemini Protocol
Urbit
Not many people seem to know about it and everyone I show it to loves it!