Back in the 90s, I implemented precompiled headers for my C++ compiler (Symantec C++). They were very much like modules. There were two modes of operation:
1. all the .h files were compiled, and emitted as a binary that could be rolled in all at once
2. each .h file created its own precompiled header. Sounds like modules, right?
Anyhow, I learned a lot, mostly that without semantic improvements to C++, precompiled headers made compilation much faster but were too sensitive to breakage.
This experience was rolled into the design of D modules, which work like a champ. They were everything I wanted modules to be. In particular,
The semantic meaning of the module is completely independent of wherever it is imported from.
Anyhow, C++ is welcome to adopt the D design of modules. C++ would get modules that have 25 years of use, and are very satisfying.
Yes, I do understand that the C preprocessor macros are a problem. My recommendation is, find language solutions to replace the preprocessor. C++ is most of the way there, just finish the job and relegate the preprocessor to the dustbin.
cogman10 · 20m ago
> just finish the job and relegate the preprocessor to the dustbin.
Yup, I think this is the core of the problem with C++. The standards committee has drawn a bad line that makes encoding modules basically impossible. Other languages with good module systems and fast incremental builds don't allow preprocessor-style craziness without some pretty strict boundaries. Even languages that have gotten it somewhat wrong (such as Rust with its proc macros) have bounded where and how that sort of metaprogramming can take place.
Even if the preprocessor isn't dustbinned, it should be excluded from the module system. Metaprogramming should be a feature of the language with clear interfaces and interactions. For example, in Java the annotation processor is ultimately what triggers code-generation capabilities. No annotation, no metaprogramming. It's not perfect, but it's a lot better than C/C++'s free-for-all macro system.
Or the other option is the Go route: don't make the compiler generate code; instead, have the build system be responsible for code generation (calling code generators). That would be miles better, as it'd allow devs to opt in to that slowdown when they need it.
Calavar · 1h ago
Did anyone reach out to you for input during the modules standardization process? D seems like the most obvious prior art, but the modules standardization process seems like it was especially cursed.
fooker · 1h ago
What has to change in C++ templates for this to work?
It seems particularly tricky to define a template in a module and then instantiate it or specialize it somewhere else.
modeless · 1h ago
The sensible way to speed up compilation 5x was implemented almost 10 years ago, worked amazingly well, and was completely ignored. I don't expect progress from the standards committees. Here it is if you're interested: https://github.com/yrnkrn/zapcc
The next major advance to be completely ignored by standards committees will be the 100% memory safe C/C++ compiler, which is also implemented and works amazingly well: https://github.com/pizlonator/fil-c
throwawaty76543 · 11m ago
> The sensible way to speed up compilation 5x was implemented almost 10 years ago, worked amazingly well, and was completely ignored. I don't expect progress from the standards committees. Here it is if you're interested: https://github.com/yrnkrn/zapcc
Of course it was completely ignored. Did you expect the standards committee to enforce caching in compilers? That's just not its job.
> The next major advance to be completely ignored by standards committees will be the 100% memory safe C/C++ compiler, which is also implemented and works amazingly well: https://github.com/pizlonator/fil-c
Again—do you expect the standards committee to enforce usage of this compiler, or what? The standards committee doesn't "standardize" compilers...
dgan · 29m ago
What is this sorcery? I've been reading HN for years, and this is the first time I've seen someone bring up a memory-safe C++. How is that not in the headlines? What's the catch, build times? Do I have to sell my house to get it?
EDIT: Oh, found the tradeoff:
hollerith on Feb 21, 2024 | prev | next [–]
>Fil-C is currently about 200x slower than legacy C according to my tests
modeless · 24m ago
The catch is performance. It's not 200x slower though! 2x-4x is the actual range you can expect. There are many applications where that could be an acceptable tradeoff for achieving absolute memory safety of unmodified C/C++ code.
But also consider that it's one guy's side project! If it were standardized and widely adopted, I'm certain the performance penalty could be reduced with more effort on the implementation. And I'm also sure that for new C/C++ code that's aware of Fil-C's performance characteristics, we could come up with ways to mitigate performance issues.
munificent · 11m ago
I mean, if I could accept a 2x-4x performance hit, then I wouldn't be using C++ in the first place. At that point, there are any of a number of other languages that are miles more pleasant to program in.
emtel · 24m ago
I did C++ for over 10 years, and now have been doing rust for about 4. On the whole, I like rust much better, but I really miss header files.
Modules are horrible for build times. If you change an implementation (i.e. something that would not normally involve editing a header), the amount of rebuilding that happens is crazy compared to any C++ project that was set up with a minimal amount of care.
StephenHerlihyy · 20m ago
Modules provide more than just speed. Compile time benefits are great and the article is right about build-time bloat being the bane of every developer. But modules serve a deeper purpose.
Explicit sub-unit encapsulation. True isolation. No more weird forward declarations, endlessly nested ifdef guards, or insane header dependency graphs. Things just exist as they are: separate, atomic, deterministic, and reliable.
Modules probably need a revision, and yes, adoption has been slow, but once you start using modules you will never go back. The clarity of explicitly declared interfaces and the freedom from header hell fundamentally changes how you think about organizing C++ code.
Start a new project with modules if you don’t believe me. Try them. That is the largest barrier right now - downstream use. They are not supported because they are not used and they are not used because they are not well supported. But once you use them you will start to eagerly await every new compiler release in hopes of them receiving the attention they deserve.
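For readers who haven't tried them, here is a minimal two-file sketch of what that encapsulation looks like (file names and extensions are illustrative and vary by compiler, as do the build flags, so this is not a drop-in example):

```cpp
// math.ixx - module interface unit (naming convention varies by compiler)
export module math;        // nothing leaks in or out of this unit implicitly

export int square(int x) { return x * x; }  // exported: visible to importers

int helper() { return -1; }  // not exported: genuinely private, no
                             // "detail" namespace tricks needed

// main.cpp - a consumer
import math;               // one import: no include guards, no macro
                           // leakage, no transitive header graph
int main() { return square(7); }
```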
psyclobe · 36m ago
Modules are unusable in real projects with real dependencies. Until that changes there’s no chance I’ll look at them.
nurettin · 1h ago
If you avoid circular refs and forward declarations by writing hierarchical code, you won't really need modules. In my current project I'm 20+ headers in and still haven't had a circular ref. Just refactor all the commonly used code, decouple with callbacks and you have yourself a nice clean header only library that you can amalgamate and/or precompile.
forrestthewoods · 1h ago
> modules should be killed and taken out of the standard.
Yes. The C++ standard is an abomination that is both over-specified and wildly under-specified.
Modules are simply never going to work en masse. It is a failed idea.
Why bother? The world seems to have moved on to Rust, C++ is only for legacy maintenance stuff anymore.
c0balt · 1h ago
I'm pretty sure a lot of fields would disagree with you. Last I checked game programming, OS development, embedded development and more were deeply invested in either C or more often C++, especially when RT tasks were involved or vendor-provided compilers are required.
Rust just doesn't have close to the same type of adoption/support yet, especially when considering various embedded platforms.
holowoodman · 1h ago
There is a big, big difference between C and C++. The article is about C++, which is being replaced by Rust imho. C is different, and far more frequently used for embedded or OS development. Don't know about games, but last I looked, stuff like Unity was C#, so something else yet again.
grg0 · 59m ago
Game engines (the thing that C# runs on top of) are written in C++. As is the C#/.NET runtime. As is anything that requires careful management of memory for performance. Application code is perfectly reasonably written in managed languages, but not so for the things that run underneath.
jonathrg · 1h ago
C++ is widely used in embedded. Most compilers support it. Usually you turn some things off (e.g. -fno-rtti, -fno-exceptions) and try to stick to some sane subset of the ++.
drwu · 56m ago
It is a pity that
- GCC switched from C to C++
- CUDA switched from C to C++
But I can understand the decision, and at that time, C++ frontend features and libs were a little bit less horrible.
okanat · 1h ago
Newer stuff, yes, and it is great. However, everything from basic rendering to browsers to the most complex applications (CAD, office software, finance, complex solvers like airline planning) is still in C++. Nobody will accept rewriting 35+ years of history.
C++ code bases are really long-lived, and a lot of other software builds upon them. Hence we cannot drop it.
Starting with C++17, I think the committee has been doing the language a disservice, piling on more and more unintended complexity by rushing "improvements" like modules.
I don't write C++ anymore due to my team switching to Rust and C only (for really old stuff and ABI). I am really scared of having to return though. Not because I am spoiled by Rust (I am though), but because catching up with all the silly things they added on top and how they interact with earlier stuff like iterators. C++ is a perfect language for unnecessary pitfalls. Newer standards just exploded this complexity.
There are a number of areas in programming where I'd always choose C++ over Rust - gameplay programming, retained-mode GUI programming and interpreted programming languages to name a few have very complex circular memory models that are somewhat solvable with weak_ptrs or refs stored in member variables passed through constructors but would be absolutely obnoxious to deal with and get right with the borrow checker.
whatevaa · 1h ago
The borrow checker is actually not a very good solution for some domains. I remember reading that it would not be good for writing a JavaScript VM like V8. Something about managing memory whose lifetime actually depends on other objects, not on the Rust code.
fooker · 1h ago
C++ has a habit of incorporating all the best ideas from its 'successor' languages once those ideas are mature.
Philpax · 1h ago
A lot of the problems with C++ are more foundational; you can't adopt the changes that newer languages have made, because that would be a new language - and we know this, because that new language's name is Carbon.
There are things you can add, but the rot still permeates the foundations, and much of the newness goes partially unused because they're just not at home in C++. Use of `std::optional` and `std::variant` is, as far as I know, still limited, even in newer C++ code, because the ergonomics just aren't there.
fooker · 1h ago
optional is heavily used in new codebases.
variant isn't, yet. We'll eventually get some kind of structural pattern matching that will make variant or its successor more idiomatic.
C++ does have quite a bit of rot, you're right. But that's the price of building technology people actually use.
Carbon doesn't seem to have any fundamentally new ideas, we'll see how well it fares in the wild.
InCom-0 · 55m ago
Have a look at any serious job postings.
C++ jobs outnumber Rust jobs somewhere around 50:1.
Internet hype meets actual industry reality :-).
boppo1 · 1h ago
Sarcasm?
grg0 · 1h ago
These comments only reflect the person's narrow view of the "world", along with a generous dosage of pride to state things about which one does not know. If every OS and driver and much of the backbone of society is "legacy maintenance stuff", then the statement is certainly correct.
bukotron · 1h ago
Rust does not solve any problem that exists in an experienced C++ developer's career. You don't write modern C++ in a way where a memory leak or a dangling pointer is possible at all.
This is exactly WHY we don't see a rush of C++ developers throwing everything away for Rust. Rust is trying to solve problems that already don't exist 99.9999% of the time in modern C++ code style and standards.
Also, some day C++ compilers or tooling will get their own borrow checker and we can completely forget about Rust - this will be done just for fun, just to stop arguing with Rust fans :)
okanat · 1h ago
This is a very bad take. Rust does solve very real problems in C family languages.
The number of people I met at Rust conferences who are rewriting at least parts of rather big C++ codebases wasn't small either.
However, there is still a big amount of code that is purely C++. Many of the older code bases still use C++03-style code too. Or they were written in the golden era of OOP design patterns and would require huge refactors to adapt to functional / modern code. Anything with Qt will not benefit from smart pointers. Even with Qt 6.
Rust cannot solve these problems since the challenges are not purely technical but social too.
r_lee · 31m ago
I would say, though, that Rust has already had a profound social effect, which has probably enabled those rewrites. It wasn't too long ago that it was brushed aside as noise, yet now it's gaining real momentum.
Philpax · 1h ago
Thinking that experience with C++'s many flaws will save you from running into them is delusional - just look at the number of CVEs in projects maintained by world-class C++ programmers.
No amount of fallible human vigilance will stop you from forgetting the existence of a C++ quirk in the code you're rushing out before heading out for the night. Human oversight does not scale.
InCom-0 · 41m ago
This is true ... except that Rust doesn't actually do any better in that regard.
Rust solves one category of problems, in a way that is not without its costs and other consequences. That is it. There are projects where this is very important; there are other projects where it's virtually useless and the consequences just get in the way. It is not magic. It doesn't make anything actually 'safe'.