Awesome! Reminds me of the good old days of QuickBasic and SCREEN 13, when you could write very small programs with fullscreen graphics.
I still have not figured out how to do fullscreen graphics on my Mac.
krackers · 2h ago
>how to do fullscreen graphics on my Mac
You can't; you don't have direct access to the framebuffer. Unless by "fullscreen" you just mean spanning edge-to-edge, in which case you can create an OpenGL or Metal view and just set the fullscreen style mask.
jebarker · 1h ago
> You can't, you don't have direct access to the framebuffer.
Why is this the case? What would be the problem with allowing it?
krackers · 1h ago
It fits Apple's modus operandi of enforcing UI/UX decisions; I assume in this case they don't want end-apps to be able to bypass the compositor (and, e.g., prevent alerts from showing on the screen or whatnot).
They used to allow it, but they removed the API after 10.6: https://developer.apple.com/library/archive/documentation/Gr...
I guess on modern macOS CGDisplayCapture() is the closest example that still works (although clearly there is still some compositing going on since the orange-dot microphone indicator still appears, and you can get the dock to appear over it if you activate mission control. I'm guessing it does the equivalent of a full-screen window but then tries to lock input somehow).
Bhulapi · 2h ago
My first experience with programming was with QuickBasic. You just brought back some memories, wish I still had all of those old programs around.
markisus · 3h ago
Can someone explain what "the framebuffer" is? I'm familiar with OpenGL programming, where the OS can provide a framebuffer for an application, but I'm confused about whether there is a global framebuffer for the entire desktop. Is this a Linux-specific concept?
Bhulapi · 3h ago
As far as I know, a framebuffer can mean different things depending on the hardware and implementation, but it traditionally refers to the actual memory holding the pixel values that eventually get written to the screen. In Linux this is abstracted by the framebuffer device, which is hardware-independent (you can actually have several fb devices, which, if I'm not mistaken, usually end up referring to different monitors). What's convenient about the implementation is that these devices behave like normal memory devices, which means you can read/write them as you would any other memory. Some more info: https://www.kernel.org/doc/html/latest/fb/framebuffer.html
fc417fc802 · 32m ago
I'll preface this by saying that I may have some misconceptions. Other people much more knowledgeable than I am have posted summaries of how modern graphics hardware works on HN before.
My understanding is that modern hardware is significantly more complicated at the lowest levels and (at least generally) no longer has a dedicated framebuffer (at least in the same sense that old hardware did).
My understanding is that the memory access provided by fbdev is an extremely simple API; in other words, an outdated abstraction that's still useful, so it's kept around.
An example of this complexity is video streams utilizing hardware accelerated decoding. Those often won't show up in screenshots, or if they do they might be out of sync or otherwise not quite what you saw on screen because the driver is attempting to construct a single cohesive snapshot for you where one never actually existed in the first place.
If I got anything wrong please let me know. My familiarity generally stops at the level of the various cross-platform APIs (OpenGL, Vulkan, OpenCL, etc.).
klank · 46m ago
Unless you're deep in a conversation with a graphics nerd, when "the framebuffer" is referenced, what the person usually means is some area of memory, accessible programmatically, that directly represents the pixels displayed on the screen. No fancy windows, vectors, or coordinates: just raw memory whose values are literally what the screen is showing.
It's not literally that in practice, but it acts/works like that.
rjsw · 2h ago
On Linux and on other operating systems that have reused the Linux DRM drivers, you can run OpenGL applications from a virtual terminal text console. Examples are kmscube [1] and the glmark2 benchmark suite.
[1] https://gitlab.freedesktop.org/mesa/kmscube
What does "in a TTY" context mean here? It doesn't mean in a terminal window, right?
freeone3000 · 56m ago
It does. Virtual terminals, including those running without X, are frequently graphical devices, allowing for full-color graphics without needing Xlib or Wayland. This lets you manipulate that capability more easily.
nimish · 2h ago
Interesting, I guess you could port LVGL to this and get a full GUI?
Bhulapi · 2h ago
I think trying to use anything from LVGL in this project would reduce to essentially just using LVGL. It's more of a project to try and build most of the components from "scratch", i.e. use as few external libraries as possible.
cellis · 2h ago
Super cool! Looks small enough to still be grokkable!
actionfromafar · 2h ago
Any license on this?
Bhulapi · 2h ago
Added MIT license
mouse_ · 3h ago
Don't type commands from the Internet, especially as root, especially when dd is involved. That being said,
If you're ever bored, from a TTY, type
sudo dd if=/dev/urandom of=/dev/fb0
This provides a nifty demonstration of how both the framebuffer and urandom work.
You can also take a sort of "screenshot" in a TTY by typing dd if=/dev/fb0 of=./shot.fb
and then you can view it by flipping those filenames around, so that shot.fb is now the input and /dev/fb0 is now the output.
Bhulapi · 3h ago
Writing urandom to the framebuffer is a joy in and of itself. You actually reminded me to have users add themselves to the video and input groups (which usually requires root privileges), but this way they can then run the library without sudo.