Learning from the Amiga API/ABI
47 points by danny00 | 37 comments | 6/1/2025, 2:21:29 PM | asm-basic-coder.neocities.org
There was a single fixed location in the entire system (address 0x4 aka ExecBase), and everything an AmigaOS application required was 'bootstrapped' from that one fixed address.
All OS data structures were held together by linked lists, everything was open and could be inspected (and messed up - of course terrible for security, but great for learning, exploring and extending).
Every OS I learned about afterwards was a huge disappointment, including Mach. Particularly because the Amiga had demystified the OS for me: it was just a bunch of lists, and thanks to the OO nature, they were all the same kind of list.
Here's what a node looks like: next, previous, a type, a priority, a name.
A task? A node. With a bunch of extra state.
An interrupt? A node. With a lot less extra state.
A message? A node. With an optional reply port if the message requires a reply.
Reply port? Oh, that's just a port.
A port? Yeah, a node, a pointer to a task that gets signaled and a list of messages.
How do you do I/O? Send special messages to device ports.
No "write() system call", it's queues at the lowest levels and at the API layer.
https://en.wikipedia.org/wiki/Carl_Sassenrath
You can still have that Amiga feeling on old PCs by using AROS: https://aros.sourceforge.io/
On Macintosh, the whole GUI ran practically in the active app's event loop. The whole system could be held up by an app waiting for something.
Microsoft made the mistake of copying Apple when they designed MS-Windows. Even today, on the latest Windows (which has had preemptive multitasking since 1995), a slow app can still effectively hold up the user interface, leaving you nothing to do but wait for it.
When Apple in the late '80s wanted to make their OS have preemptive multitasking, they hired the guy who had written Amiga's "exec" kernel: Carl Sassenrath.
Could you explain what you mean here? If you were to make your event loop or wndprocs hang indefinitely, it would not hang the Windows interface for the rest of the machine; it would just cause "(Not Responding)" behavior and prompt you to kill the program. As far as I can remember it's been that way since at least Windows 2000.
(edit: and to be clear, I did read the article and see what it said, but without more detail I'm not 100% sure what it really looks like in practice, and why it would be less likely for applications to have situations where they become unresponsive.)
I remember the Amiga had the checkered beach ball bouncing demo and others copied it, then on the Amiga they opened up several copies of the demo all multitasking and bouncing at the same time.
The only downside of the Amiga was the dreaded Guru Meditation errors when memory got corrupted or the like. IIRC AmigaDOS/Workbench had no memory protection.
This was a limitation of the original MC68k CPU architecture. Though the Amiga operating system was indeed designed to leverage a single address space throughout, which made it significantly harder to retrofit memory protection after the fact.
Technically the Amiga could display a rock solid hires picture but only on a special monitor that I personally never saw.
The priority on the Mac was to have a high-quality monitor for black-and-white graphics. They put a lot of effort into drawing libraries to make the most of the built-in display.
The result was that the Amiga was perfectly fine for playing games or light word processing but if you actually needed to stare at a word processor or spreadsheet for 8 hours a day you really wanted a Mac.
Apple in the 90's was circling the drain, nobody wanted an overpriced black-and-white computer except die-hard Apple fans, and Apple only exists today because Microsoft bailed them out. Too bad Microsoft didn't invest in Amiga instead.
In what universe was 512x342 better than 640x400?
There was an add-on called a "Flicker Fixer" that buffered the video signal and emitted VGA signals at twice the pixel clock and horizontal refresh rate. The Amiga 3000 had one built in.
The ECS and AGA chipsets supported "Productivity mode" that emitted VGA signals but ECS supported only four colours in this mode. All games used the TV modes. "Multisync" monitors that could switch between VGA and television refresh rates were expensive, so few people had them.
Also remember the Amiga was competing with the Mac II line for most of its life. Yes, it was much more expensive... but we are comparing specs, and you could get Mac II displays that supported 256 colors out of 16 million (24-bit). The Amiga didn't have 24-bit color until 1992.
You cannot be serious. I provided the ACTUAL specifications of the screen resolution of both platforms, and somehow you still say the Amiga was "underwhelming in resolution" when it actually had MORE pixels in both horizontal and vertical than the Macintosh? How can you actually say this? The Amiga had 256,000 pixels, the Macintosh had only 175,104 pixels. The numbers do not lie. The Amiga had 80,896 MORE pixels than the Macintosh. PAL mode offered even more pixels on the Amiga. You're just plain wrong.
FWIW, I also had both platforms, and vastly preferred the Amiga, not just for the higher screen resolution, but also the 4096 colors it provided vs. the 2 colors of the Macintosh. And the far better multitasking, stereo 14-bit sound, amazing games, AREXX, and a lot more. Mac was always way behind the Amiga, in every single way including resolution.
The Amiga was ahead of its time in many ways, and the pre-emptive multitasking was fantastic, but claiming it was some paragon doesn't help anyone. If you wanted a fun home machine attached to a TV, it was great. Even a fun home machine attached to a monitor. If you wanted a business machine with a monitor, it wasn't the safest or best choice, if only due to a lack of software.
That being said I preferred the Amiga.
The Amiga was a bargain in comparison, but it was not without its flaws, like all early machines. I had an A500 with a 1084 monitor, and the flicker at high res was bothersome to me. I later upgraded to an A3000 w/VGA monitor, and it was a vast improvement. I ran at 640x400 for everything at that point.
I think you are underestimating the price of "flicker fixers" at the time. I looked up the price of a Microway flicker fixer in an old Amiga World from 1988: over $500. You also had to add a VGA monitor: another $400.
And the earliest ARM machines ran rings around the Amiga because they had a custom-designed RISC CPU, so they could dispense with the custom co-processors. (They still cost a lot more than the Amiga, since they targeted the expensive education sector. Later on ARM also got used for videogame arcades with the 3DO.)
By contrast, there's a story about some Microsoft engineers taking a look at the Macintosh and asking the Apple engineers what kind of custom hardware they needed to pull off that kind of interface. The Apple guys responded, "What are you talking about? We did this all with just the CPU." Minds blown.
The designers of the Mac (and the Atari ST) deserve mad credit for achieving a lot with very little. Even though, yes, the Amiga was way cooler in the late 80s.
I know this first hand, because I got my first email address with CompuServe, running their software under emulation, while using my Amiga's dial-up modem. (I had to sneak the Mac ROM images from the computers at school...)
"no dynamic linking" (by implementing dynamic linking)
"no zombies" (as long as your programs aren't buggy)
I fail to see any meaningful distinction from what we have today. If it was more reliable, it was due to being smaller in scope and having a higher barrier to entry.
On modern Linux systems you can even do separate sets of memory permissions within a single process (and single address space), with system calls needed only at startup; see `pkeys(7)`.
https://www.man7.org/linux/man-pages/man7/pkeys.7.html
(note however that there aren't enough pkeys available to avoid the performance problem every microkernel has run into)