there are a few more semantic families: verilog, petri nets and variants, Kahn process networks and dataflow machines, process calculi, reactive, term rewriting, constraint solvers/theorem provers (not the same as Prolog), probabilistic programming,
plus up-and-coming (actually production-ready) languages that don't fit perfectly in the 7 categories: unison, darklang, temporal dataflow, DBSP
It may feel like a little bit of cheating to mention the above ones, as most run in parallel to the regular von Neumann machine setup, but I've been meaning for a while to do an article on 'all the ways we know how to compute (beyond von Neumann)'.
> I've been meaning for a while to do an article on 'all the ways we know how to compute (beyond von Neumann)'.
I would be very glad to read this.
In the meantime, I reproduce a part of an article by Steve Yegge:
---
What Computers Really Are
Another realization I had while reading the book is that just about every course I took in my CS degree was either invented by Johnny von Neumann, or it's building on his work in mostly unintelligent ways.
Where to start? Before von Neumann, the only electronic computing devices were calculators. He invented the modern computer, effectively simulating a Universal Turing Machine because he felt a sequential device would be cheaper and faster to manufacture than a parallel one. I'd say at least 80% of what we learned in our undergrad machine-architecture course was straight out of his first report on designing a programmable computer. It really hasn't changed much.
He created a sequential-instruction device with a fast calculation unit but limited memory and slow data transfer (known as the infamous "von Neumann bottleneck", as if he's somehow responsible for everyone else being too stupid in the past 60 years to come up with something better. In fact, Johnny was well on his way to coming up with a working parallel computer based on neuron-like cellular automata; he probably would have had one in production by 1965 if he hadn't tragically died of cancer in 1957, at age 54.)
Von Neumann knew well the limitations of his sequential computer, but needed to solve real problems with it, so he invented everything you'd need to do so: encoding machine instructions as numbers, fixed-point arithmetic, conditional branching, iteration and program flow control, subroutines, debugging and error checking (both hardware and software), algorithms for converting binary to decimal and back, and mathematical and logical systems for modelling problems so they could be solved (or approximated) on his computing machine.
Von Neumann may well have been the smartest man ever to live, but giving him credit for all of this is too much; it brushes aside many other inventors (who often worked independently, to his credit).
I volunteered to do the formula parser, thinking it sounded like a fun challenge.
I was stumped for a week, until I realized I could rewrite the formulas into a form I knew how to parse. So it would rewrite 1+1 into ADD(1,1) and so on.
I also refused to learn regex, so the parsing code was "interesting" ;)
I recall a comment from a colleague. "Okay, Andy says it works. Don't touch it." XD
Guy from another group used regex and his solution was 20x shorter than mine.
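For flavor, here is a toy, regex-free sketch of the kind of rewrite trick described above: turning infix like `1+1` into `ADD(1,1)`. It is strictly left-to-right, integers only, with no precedence or parentheses — hypothetical code for illustration, not the commenter's actual parser.

```python
# Map infix operators to function names (illustrative choice).
OPS = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}

def tokenize(formula):
    """Split a formula into numbers and operators, no regex needed."""
    tokens, number = [], ""
    for ch in formula:
        if ch.isdigit():
            number += ch
        else:
            if number:
                tokens.append(number)
                number = ""
            if ch in OPS:
                tokens.append(ch)
    if number:
        tokens.append(number)
    return tokens

def rewrite(formula):
    """Rewrite infix into nested function-call form, left to right."""
    tokens = tokenize(formula)
    result = tokens[0]
    for op, rhs in zip(tokens[1::2], tokens[2::2]):
        result = f"{OPS[op]}({result},{rhs})"
    return result

print(rewrite("1+1"))    # ADD(1,1)
print(rewrite("1+2*3"))  # MUL(ADD(1,2),3) -- note: no precedence
```

Note that the left-to-right rewrite gets operator precedence wrong (`1+2*3` becomes `MUL(ADD(1,2),3)`), which is exactly the kind of quirk that earns a "Don't touch it."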
Regular expressions are probably not enough for parsing formulas (depending, of course, on the exact task); formula languages are usually at least context-free.
Regular expressions are definitely enough for turning characters into tokens, after which a simple recursive descent parser is vastly more straightforward to write. Lexing is optional, but generally advised.
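A minimal sketch of that split, assuming a hypothetical grammar with `+ - * /` and parentheses: a regex does the lexing, and a handful of mutually recursive functions do the parsing.

```python
import re

# Grammar (hypothetical, for illustration):
#   expr   -> term (("+" | "-") term)*
#   term   -> factor (("*" | "/") factor)*
#   factor -> NUMBER | "(" expr ")"
# The regex only needs to recognize tokens, which ARE regular.
TOKEN = re.compile(r"\s*(?:(\d+(?:\.\d+)?)|([+\-*/()]))")

def lex(src):
    """Turn a source string into a list of numbers and operator strings."""
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad character at position {pos}")
        num, op = m.groups()
        tokens.append(float(num) if num else op)
        pos = m.end()
    return tokens

def parse(tokens):
    """Recursive descent over the token list; returns a nested tuple AST."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        tok = tokens[pos]
        if expected is not None and tok != expected:
            raise SyntaxError(f"expected {expected}, got {tok}")
        pos += 1
        return tok

    def expr():
        node = term()
        while peek() in ("+", "-"):
            node = (eat(), node, term())
        return node

    def term():
        node = factor()
        while peek() in ("*", "/"):
            node = (eat(), node, factor())
        return node

    def factor():
        if peek() == "(":
            eat("(")
            node = expr()
            eat(")")
            return node
        return eat()

    tree = expr()
    if peek() is not None:
        raise SyntaxError("trailing input")
    return tree

print(parse(lex("1 + 2 * 3")))  # ('+', 1.0, ('*', 2.0, 3.0))
```

The recursion in `factor -> "(" expr ")"` is precisely what a pure regex cannot express: matching nested parentheses needs a stack, which the call stack provides for free.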
Plus up-and-coming (not quite production-ready IMO, but used in production anyway): ChatGPT and the like.
Of course, it's debatable whether they are programming languages, but why wouldn't they be? They aren't deterministic, but I don't think determinism is a must for a programming language, and they are used to let humans tell computers what to do.
You can just get the code without buying the book, learn with Simply Scheme or any other book, and apply the functions from the code; the solvers are really easy to understand.
Great list of languages that don't fit the conventional families. I've been curious about some of them, like Petri nets and term rewriting, and will enjoy exploring the others.
Found a working link to the paper about propagators.