All of the methods defined here:
https://github.com/Helion-Engine/Helion/blob/20300d89ee4091c...
Are available in the kitchen sink:
https://learn.microsoft.com/en-us/dotnet/api/system.numerics...
Same idea applies to methods like GetProjection, which could be replaced with methods like:
https://learn.microsoft.com/en-us/dotnet/api/system.numerics...
Advantages of using this library are that it uses intrinsics (SIMD) to accelerate operations. There is a lot of Microsoft money & time that has been invested into these code piles.
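(For readers who haven't touched System.Numerics, here is a minimal, self-contained sketch of the kind of built-in calls being referred to. It is illustrative only, not Helion code.)

    // Sketch: the vector math System.Numerics provides out of the box.
    // Vector3 is a struct the JIT can lower to SIMD instructions on supported CPUs.
    using System;
    using System.Numerics;

    class VectorDemo
    {
        static void Main()
        {
            var a = new Vector3(1f, 2f, 3f);
            var b = new Vector3(4f, 5f, 6f);

            float dot = Vector3.Dot(a, b);        // 1*4 + 2*5 + 3*6 = 32
            Vector3 unit = Vector3.Normalize(a);  // a scaled to length 1
            Vector3 mid = Vector3.Lerp(a, b, 0.5f);

            Console.WriteLine($"{dot} {unit} {mid}");
        }
    }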
qingcharles · 1d ago
I also see the guys from Intel constantly stabbing at all these low-level types to optimize them too. There are optimizations in .NET 10 for processors that aren't even released yet.
marhee · 22h ago
I suspect it's part of the fun? A way to really learn something?
There's also another hint:
// THIS FILE WAS AUTO-GENERATED.
// CHANGES WILL NOT BE PROPAGATED.
// ----------------------------------------------------------------------------
(Of course, this could be the result of something that has nothing to do with the contents of the file, but maybe the author has a meta library that can generate the types in different languages.)
There also seem to be fixed-precision variants of the vector types, which don't appear to be available in the .NET framework.
Plus, of course, you can't add your specific needs to library types (like the fixed precision). They are closed to modification.
I am just guessing, of course.
That being said, it would also make total sense to use the .NET types.
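(To illustrate the fixed-precision point: classic Doom does its math in 16.16 fixed point, and a type like that can't be bolted onto a sealed library struct such as System.Numerics.Vector3. A hypothetical sketch, not Helion's actual type:)

    // Hypothetical 16.16 fixed-point value and a small vector built on it,
    // the kind of domain-specific type a stock library struct can't provide.
    using System;

    public readonly struct Fixed
    {
        public const int FracBits = 16;
        public readonly int Raw;              // stored as value * 2^16

        public Fixed(int raw) => Raw = raw;

        public static Fixed FromDouble(double d) => new Fixed((int)(d * (1 << FracBits)));
        public double ToDouble() => Raw / (double)(1 << FracBits);

        public static Fixed operator +(Fixed a, Fixed b) => new Fixed(a.Raw + b.Raw);
        public static Fixed operator *(Fixed a, Fixed b) =>
            new Fixed((int)(((long)a.Raw * b.Raw) >> FracBits));

        public override string ToString() => ToDouble().ToString("0.####");
    }

    public readonly struct Vec2Fixed
    {
        public readonly Fixed X, Y;
        public Vec2Fixed(Fixed x, Fixed y) { X = x; Y = y; }

        public static Vec2Fixed operator +(Vec2Fixed a, Vec2Fixed b) =>
            new Vec2Fixed(a.X + b.X, a.Y + b.Y);
    }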
jabart · 1d ago
It looks like these types were code-gen from something else.
https://github.com/Helion-Engine/Helion/commit/e6affd9abff14...
Someone should learn some CS and then how to make proper commits.
Additionally this is generated code.
It heavily uses numerics and the performance is amazing.
materialpoint · 20h ago
Historically the .NET and XNA vector types have been seriously lacking for real graphics development, and they still don't even provide swizzling. It's likely that this project predates .NET numerics by many years, and anyone who has had a pet project for long enough will learn to avoid becoming too dependent on libraries and platforms that will die out.
energywut · 1d ago
I wonder if it can play through MyHouse.wad. Which, if you haven't seen it before, is an incredible art piece.
https://www.youtube.com/watch?v=5wAo54DHDY0
If you've read House of Leaves, do yourself a favor and check it out.
Thank you for the, so far, half-day rabbit hole...
Agreed, it is an incredible art piece, and now I want to go find a copy of House of Leaves.
low_tech_love · 1d ago
He used Eviternity 2 as an example, my personal GOTY for 2024. Check it out if you haven’t!
tines · 1d ago
Oh man, House of Leaves is amazing. Danielewski has my respect.
wiseowise · 1d ago
I’m also curious about this.
ngrilly · 1d ago
Finally, a good example of a modern C# code base that is open source, and that doesn't look like the equivalent of J2EE in C#.
aeonik · 18h ago
Thanks for your comment, made me want to check it out, and yea it's really clean code.
tester756 · 1d ago
I wanted to provide a link to the Ryujinx repo, but I've found that Nintendo threatened them and they had to close the project :(
necrosyne · 1d ago
Allegedly, Nintendo chose to buy them off with millions of dollars to suspend development instead of pursuing legal action.
lrae · 1d ago
Is there any source on that?
All I'm seeing is that they got their hands on the domain, which can be (and was in the past) just part of whatever settlement they agreed on, and the game press spun that into "Nintendo bought Ryujinx".
Cieric · 1d ago
This looks interesting and I'm going to take a look later. Just a minor nitpick up front though: I think the performance graph should be a bar graph instead of a line graph. Mainly because the in-between states don't have much meaning, as you can't be halfway between two different GPUs.
ogurechny · 1d ago
Those discussions are a bit misleading. Original Doom updates its state only 35 times a second, and ports that need to remain compatible must follow that (though interpolation and prediction tricks are possible for visual smoothing of the movement). The rendering engine is also completely orthogonal to polygon-based 3D accelerators, so all their power is unused (apart from, perhaps, image buffers in fast memory and hardware compositing operations). Performance on giant maps therefore depends on CPU speed. The point of this project is making the accelerator do its job with a new rendering process.
Though I wonder how sprites, which are a different problem orthogonal to polygonal rendering, are handled. So, cough cough, Doxylamine Moon benchmarks?
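(A rough sketch of the interpolation trick mentioned above, assuming a fixed 35 Hz tic rate; this is illustrative, not Helion's code. The renderer blends the last two tic positions by how far it is into the current tic:)

    // Sketch: render-time interpolation between 35 Hz game tics.
    // prevPos and currPos come from the two most recent completed tics;
    // secondsSinceTic is the wall-clock time since the last tic ran.
    using System;
    using System.Numerics;

    static class TicInterpolation
    {
        public const double TicRate = 35.0;

        public static Vector3 InterpolatePosition(Vector3 prevPos, Vector3 currPos, double secondsSinceTic)
        {
            double alpha = Math.Clamp(secondsSinceTic * TicRate, 0.0, 1.0);
            return Vector3.Lerp(prevPos, currPos, (float)alpha);
        }
    }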
kfuse · 1d ago
"Rendering engine is also completely orthogonal to polygon-based 3D accelerators"
Software rendering engine, yes (and even then you can parallelize it). But there is really no reason why Doom maps can't be broken down into polygons. Proper sprite rendering is a problem, though.
ogurechny · 1d ago
Sure, that has been done since the late '90s release of the source code, either by converting visible objects to triangles to be drawn by the accelerator (glDoom, DoomGL), or by transplanting game data and mechanics code into an existing 3D engine (Vavoom used the recently open-sourced Quake).
However, proper recreation of the original graphics would require shaders and much more modern, extensive, and programmable pipelines, while the relaxed artistic attitude (or just contemporary technical limitations) unfortunately resulted in a trashy Y2K amateur 3D shooter look. Leaving certain parts to software meant that the CPU had to do most of the same things once again. Also, 3D engines were seen as a base for exciting new features (arbitrary 3D models, complex lighting, free camera, post-processing effects, etc.), so the focus shifted in that direction.
In general, CPU performance growth meant that most PCs could run most Doom levels without any help from the video card. (Obviously, map makers rarely wanted to work on something that was too heavy for their systems, so the complexity was also limited for practical reasons.) 3D rendering performance (in non-GZDoom ports) was boosted occasionally to enable some complex geometry or mapping tricks in popular releases, but there was little real pressure to use acceleration. On the other hand, the linear growth of single-core performance stopped long ago, while the urges of map makers haven't, so there might be some need for “real” complete GPU-based rendering.
kfuse · 1d ago
As I said, the traditional Doom BSP-walker software renderer is quite parallelizable. You can split the screen vertically into several subscreens and render them separately (does wonders for epic maps). The game logic, or at least most of it, can probably be run in parallel with the rendering.
And I don't think any of the above is necessary. Even according to their graphs, popular Doom ports can render huge maps at sufficiently high fps on reasonably modern hardware. The goal of this project, as stated in the Doomworld thread, is to be able to run epic maps on a potato.
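(A toy illustration of the vertical-split idea, not the actual renderer: each strip of screen columns gets rendered on its own core. RenderColumn here is a stand-in for the real per-column work.)

    // Toy sketch: splitting a software-rendered frame into vertical strips
    // and drawing the strips on separate cores.
    using System;
    using System.Threading.Tasks;

    static class StripRenderer
    {
        public static void RenderFrame(uint[] framebuffer, int width, int height)
        {
            int strips = Environment.ProcessorCount;
            int stripWidth = (width + strips - 1) / strips;

            Parallel.For(0, strips, s =>
            {
                int x0 = s * stripWidth;
                int x1 = Math.Min(x0 + stripWidth, width);
                for (int x = x0; x < x1; x++)
                    RenderColumn(framebuffer, x, width, height);
            });
        }

        // Stand-in for the per-column BSP walk and span drawing.
        static void RenderColumn(uint[] framebuffer, int x, int width, int height)
        {
            for (int y = 0; y < height; y++)
                framebuffer[y * width + x] = 0xFF000000; // opaque black
        }
    }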
atmavatar · 1d ago
Even just updating the graphs would be helpful. There appear to have been several releases since 0.9.2.0, including a bump from .NET 7 to .NET 8 (and a bump to .NET 9 in dev).
The more recent .NET versions by themselves are likely to have some impact on the performance, let alone any changes in Helion code between versions.
kevingadd · 1d ago
Might make sense to use a logarithmic scale for the graphs too; it's hard to tell what speed the other ones are since they're compressed so far down.
_0ffh · 1d ago
It's a Doom engine, and they missed the opportunity to call it "Hellion"??
bigbuppo · 1d ago
And with the name Hellion they could also go down the Judas Priest rabbit hole.
dimitropoulos · 1d ago
the Doom in TypeScript types project wouldn't have been possible without Nick and Helion - I owe Nick a huge thanks! He helped with some of the more obscure parts of the engine and also helped make a super small WAD, which is what the game eventually ran in.
Legend.
LoganDark · 16h ago
Doom in TypeScript types is amazing. Thank you for losing your mind for the rest of us :)
yodon · 1d ago
Impressive C# performance!
dax_ · 1d ago
Microsoft has really been putting a lot of focus on improving it with each release. I love reading through the blog articles for each major release, that outline all the performance improvements that were done: https://devblogs.microsoft.com/dotnet/performance-improvemen...
runevault · 1d ago
A warning for those not in the know, the performance improvement posts famously give mobile browsers trouble because they are so massive. All because the extent of the improvements is so great (along with the amount of detail the posts go into about the improvements).
ddingus · 1d ago
I just viewed the one linked above, and the coupla second render delay at first aside, the post displayed nicely, at full frame rate.
Old Note 9, Chrome and Firefox.
Non flagship mobile devices could very well choke on one of those pages, but most newer devices should display these pages with little grief.
runevault · 1d ago
Interesting, I thought I saw the usual complaints even in the past year.
qingcharles · 1d ago
And if you look at the PRs for the core, there are Intel people hacking away at the low-level routines too, to make it run better on their latest server CPUs.
It can be quite performant these days; sadly, I'm stuck developing in Unity Mono C#, which is quite a bit behind.
bee_rider · 1d ago
Finally I can play Doom on my 2khz monitor.
dataflow · 1d ago
2 kHz monitor? Is that a joke, or a typo, or real?
bee_rider · 1d ago
A joke.
Although actually now that you ask, monitor speeds are getting pretty crazy, so I guess it isn’t implausible enough!
nartho · 1d ago
Well, TCL unveiled the first kHz monitor, so my guess is they'll get there soon.
nottorp · 20h ago
Real or advertised monitor speeds? :)
mawadev · 1d ago
The benchmarks look a bit sketchy... is the framerate uncapped for all the other engines, and has vsync been disabled? It's a very odd graph to look at, but great performance regardless.
I have seen some use cases for MemoryStream; why not use RecyclableMemoryStream instead?
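(For context, assuming the Microsoft.IO.RecyclableMemoryStream NuGet package, the swap being suggested looks roughly like this:)

    // Sketch: pooled streams via a RecyclableMemoryStreamManager instead of
    // allocating a fresh MemoryStream for every use.
    using System.IO;
    using Microsoft.IO;

    static class StreamExample
    {
        // One manager per process; it owns the pooled buffers.
        static readonly RecyclableMemoryStreamManager Manager = new RecyclableMemoryStreamManager();

        public static long WritePayload(byte[] payload)
        {
            // Instead of: using var ms = new MemoryStream();
            using MemoryStream ms = Manager.GetStream();
            ms.Write(payload, 0, payload.Length);
            return ms.Length;
        }
    }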
thomasqbrady · 1d ago
How does licensing work, here... could you use this to develop an indie game and sell it?
gr4vityWall · 1d ago
Yes. Only requirement is that your game code is Free Software (GPLv3).
kasajian · 1d ago
Not sure what the question is. The License is clearly stated.
gcr · 1d ago
No, it’s GPL3, so your game must be open-source.
If the authors wanted to protect engine development while allowing indies to sell games made on it, they would have picked LGPL or a more permissive license.
detaro · 1d ago
since when do people not sell GPL games?
energywut · 1d ago
You are technically correct, and I believe the GPL doesn't cover the assets for the game (levels, art, audio, etc.), but I suspect there aren't many GPL licensed games out there for sale that have sold enough copies to make developing them worthwhile financially.
I'd love to be wrong, so if you have a few examples, I'm all ears.
mjr00 · 1d ago
Probably not much in the AA/AAA space, but plenty of indies. The Doom engine (and GZDoom, which is the most common Doom engine derivative) is GPL and there have been multiple commercially successful games released using it. I know at least Hedon[0] and Hands of Necromancy[1] sold enough copies to warrant a sequel.
GPL vs LGPL definitely isn't a blocker for a commercial game, in any case.
[0] https://github.com/madame-rachelle/hgzdoom https://store.steampowered.com/app/1072150/Hedon_Bloodrite/
[1] https://store.steampowered.com/app/1898610/Hands_of_Necroman...
Remember, the GPL only applies to the code. You can make a great game with beautiful artwork and distribute the source code to anyone who wants it. Nobody playing the game will have much fun without the artwork.
nottorp · 20h ago
Carmack has a post from ages ago wondering why no one does that with the id engines they open-sourced, which were pretty current back then. He was talking about the quake (2?) source code dumps I think.
Edit: ohh, I found it: http://www.gamespy.com/articles/641/641662p6.html
The GPL license will allow people to take the Quake 3 engine and even go so far as to release a commercial product with it - provided that the source code is published alongside. Nobody has done this with any of the Quake engine games yet, but he hopes to see it happen someday.
energywut · 1d ago
That's literally the first sentence I wrote in my comment. ;)
nailer · 1d ago
Sorry, you're right. I somehow missed that. There's some indie boomer shooters using the Doom engine that are commercially licensed IIRC.
whizzter · 18h ago
You can sell them on PC, but any dream of console releases is dead in the water, as Sony etc. forbid distribution, or even public sharing, of code using their SDKs.
patrick4urcloud · 1d ago
I will give it a try.
neuroelectron · 1d ago
FPSes ought to update the screen every millisecond. Why isn't this more common?
imbusy111 · 1d ago
That would require a 1000Hz screen to even be able to output. 144Hz is the high-end today. It seems to be pointless to push beyond that.
c-hendricks · 1d ago
240/360/480/540hz monitors all exist. 144hz is not that high a bar anymore, and kind of the odd one out given that it's not a multiple of 60.
matja · 5h ago
Or you could say 540Hz is the odd one out for not being a multiple of 24Hz, the standard since ~1930.
theblazehen · 22h ago
Because you quickly get into diminishing returns, while significantly scaling the hardware required?