Yes, the Apple II MouseCard IRQ is synced to the VBL
63 points by mmphosis on 5/8/2025, 12:19:55 PM | 29 comments | colino.net
The development of that code was very painful. At the time, there was no external debugger. The moment you enabled the interrupt, your interrupt handler would get called, and it would try to perform a context switch (something the 6502 is definitely not supposed to do). If you had any bug in there, your Apple II would freeze completely; all you could do was reboot, try to guess what went wrong, and try again.
When writing demos and games with Mockingboard support, it's always tempting to take advantage of the timers on its 6522 chips, though that does mean the programs wouldn't have been usable for most people back in the 80s/90s.
If you ever get a time machine: the SSC could generate an interrupt. So could the Microsoft card. There was also something called the Thunderclock card that could interrupt at around 100 Hz.
In its defense, it also predates a lot of what we take for granted. The idea that a game might want to scroll smoothly or update at 50/60 FPS without flickering just didn't occur to people in 1979.
Cost was the innovation for those early home computers. They weren’t doing anything remarkable, except for doing it at a price ordinary people could actually afford. If you were willing to spend more (because the computer ran your payroll, or because the computer got people to drop quarters into it all day long) you could get much more capable stuff.
These days you can drop a few hundred bucks and get something that’s not too far off from the best that money can buy. The main difference between a cheap PC and what passes for a “supercomputer” these days is that the supercomputer has much better interconnects and there’s just a lot more of it.
It's been a real trip watching the accumulation of exponential improvements the past 50 years.
The Atari 800 Home Computer System was released in 1979 and had both bitmap and character graphics as well as sprites.
The Commodore 64 is _so_ much better for graphics and sound it's not even funny, but if you look at the timelines, it came along a _while_ later than the Apple II.
Woz is one of the greatest engineers of the 20th century, and the Apple II demonstrates his talent. But his brilliance at simplifying things always straddles the line between optimized and overoptimized. The Disk II might be his greatest feat at doing more with less, while the video circuitry falls just into overoptimization, given the color fringing, NTSC dependence for color, and lack of lowercase. Integer BASIC is somewhere in the middle; great performance (especially given that Woz knew nothing about mainstream BASIC, or maybe because of it), but the code is so tightly written that it was easier for Apple to license Microsoft BASIC than to add floating-point code to Woz's work.
Without the mouse IRQ, if one wants to support the whole line of Apple II computers, one has to vapor lock on the II+, remember that the high bit of $C019 on the IIgs means the inverse of what it means on the IIe, and deal with the extra trickery the //c requires. (cf https://github.com/cc65/cc65/blob/master/libsrc/apple2/waitv...)
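For anyone who hasn't done the polling version: here is a rough ca65-style sketch of the IIe case, assuming the usual reading of $C019 (RDVBLBAR) where bit 7 is 1 outside vertical blanking and 0 during it. This is only an illustration with made-up label names, not the actual cc65 waitvsync code; per the parent comment, the IIgs has the opposite bit-7 sense, the //c needs extra handling, and the II+ has no VBL flag at all, so none of those cases are covered here.

    RDVBLBAR = $C019        ; IIe soft switch: bit 7 = 1 while displaying,
                            ; bit 7 = 0 during vertical blanking
                            ; (IIgs reportedly inverts this; II+ lacks it)

    wait_vbl:
            bit RDVBLBAR    ; copy bit 7 into the N flag
            bpl wait_vbl    ; still inside the previous VBL? keep waiting
    wait_edge:
            bit RDVBLBAR
            bmi wait_edge   ; still displaying? keep waiting
            rts             ; bit 7 just dropped: a new VBL has begun

The two loops are there to catch the falling edge, so a caller that arrives mid-blanking still synchronizes to the start of the next blanking period rather than returning immediately.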
However, for a machine released in 1983 (the Apple IIe), it is indeed very odd. But the IIe is an odd machine in many ways.
The Apple II platform stagnated as Apple poured all their resources into the Apple III (which had all those features and much more).
The Apple II refused to die, so Apple assigned a pair of engineers to design a cost-reduced version of the Apple II, and this became the IIe. The goal was only to minimize manufacturing costs, so new features like timers were off the table.
The IIe became an unexpected smash hit in the home and education markets (stealing those markets from the 128k Mac), and only then did Apple devote some new resources to the platform (and reposition the 128k Mac as a laughably underpowered productivity machine).
The Apple IIc (1984) was the first Apple II to get a proper modern makeover. Of course it was a flop, while the oddball IIe continued to fly off the shelves.
Thus there was downtime for an otherwise idle chip to do some work, and yes, saving a few dollars or even tens of dollars was a LOT of money in the 1970s!
Lots of browsers don't have reader mode, or it sucks. Chrome, for example, puts it in a tiny narrow sidebar where it is ironically difficult to read. Images are removed or mangled. It's a mess. And you don't always have a choice of which browser you get to use anyway.
The responsibility to ensure text is readable lies with websites, not with browser features that a user may or may not have.
Your browser doesn’t have it, or it’s bad? Use a different one!
Responsibility? Responsibility and five bucks will get you a coffee at Starbucks.
You can solve the problem, or talk about how other people should solve it. Do whichever one you prefer, but only one of them actually produces the desired outcome.
If you actually talk publicly about how it should be properly addressed in the first place, you add to the societal pressure for other people to do things right. That's the only approach that produces the desired outcome more broadly and in the long term.
We have valuable things like accessibility standards because enough people complained. Or would you rather we didn't have those because those people were "just complaining"?
> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
And there are things you can critique about Chrome, but it's certainly not "crappy". It's also the most popular by far, even on devices where it's not the default.
So please don't tell people not to complain, and please don't call them whiny. It's entirely inappropriate, and not helpful at all. HN is not the place for these things.