Learn Makefiles

229 points by dsego | 6/20/2025, 8:05:55 AM | makefiletutorial.com ↗

Comments (117)

bsenftner · 7h ago
Way back in the dark ages of 1985, I encountered a guy at the Boston University Graphics lab who was using Makefiles to drive the generation of a 3D renderer for animation. He was a Lisp guy, doing early procedural generation and 3D actor systems. His Makefile was extremely elegant, about 10 lines total. It generated hundreds of animations, all based on the simple file date dependency. He had Lisp generating the 3D form for each frame, and then Make would generate the frames. This being '85, pre pretty much everything we take for granted with 3D and animation, the guy was blowing everyone's mind. He went on to write the 3D renderer for Iron Giant, and was key in Caroline too, I seem to remember. Brian Gardner.
agumonkey · 6h ago
bsenftner · 6h ago
Yep, that's Brian. We've not spoken in years, but I've known him since '85.
agumonkey · 4h ago
Maybe one day we'll have a chance to see his 3d rendering makefile-fu :)

thanks for the story nonetheless

1vuio0pswjnm7 · 1h ago
"His Makefile was elegant, about 10 lines in total."

It sounds like this was before BSD or Linux. Now we have these silly incompatibilities.

adhamsalama · 7h ago
You mean Coraline, right?
bsenftner · 6h ago
Yes.
stabbles · 9h ago
A couple make flags that are useful and probably not very well known:

Output synchronization, which makes `make` print stdout/stderr only once a target finishes. Otherwise output is typically interleaved and hard to follow:

    make --output-sync=recurse -j10
On busy / multi-user systems, the `-j` flag for jobs may not be best. Instead you can also limit parallelism based on load average:

    make -j10 --load-average=10
Randomizing the order in which targets are scheduled. This is useful for your CI to harden your Makefiles and see if you're missing dependencies between targets:

    make --shuffle # or --shuffle=seed/reverse
blueflow · 6h ago
> and probably not very well known

Maybe the make authors could compile a list of options somewhere and ship it with their program, so users could read them? Something like a text file or using some typesetting language. This would make that knowledge much more accessible.

moefh · 5h ago
Not sure if you're being snarky, but the manual has a list of all options accepted by make: https://www.gnu.org/software/make/manual/html_node/Options-S...

(`make --help` will only print the most common options)

tux1968 · 5h ago
make --help

Will give you the command line options. And GNU make has decent documentation online for everything else:

https://www.gnu.org/software/make/manual/html_node/index.htm...

f1shy · 8h ago
The one I use most is -B, for an unconditional rebuild of everything.
monkeyelite · 4h ago
> On busy / multi-user systems

Can’t the OS scheduler handle it?

davemp · 7h ago
I’ve seen and had ‘make -j’ dos machines enough times that I consider it a bug.
bayindirh · 7h ago
If "make -j" successfully drowns a machine, I can argue that the machine has no serious bottlenecks for the job. Make is generally I/O bound when run with high parallelism, and if you can't saturate your I/O bandwidth, that's a good thing in general.

However, if "make -j" saturates a machine and this is unintentional, I'd assume PEBKAC, or "holding it wrong", in general.

davemp · 6h ago
The problem is ‘make -j’ spinning up 100s of C++ compilation jobs, using up all of the systems RAM+swap, and causing major instability.

I get that the OS could mitigate this, but that’s often not an option in professional settings. The reality is that most of the time users are expecting ‘make -j $(N_PROC)’, get bit in the ass, and then the GNU maintainers say PEBKAC—wasting hundreds of hours of junior dev time.

dspillett · 5h ago
> The problem is ‘make -j’ spinning up 100s of C++ compilation jobs, using up all of the systems RAM+swap, and causing major instability.

I would put that in the “using it improperly” category. I never use⁰ --jobs without specifying a limit.

Perhaps there should have been a much more cautious default instead of the default being ∞, maybe something like four¹, or even just 2, and if people wanted infinite they could just specify something big enough to encompass all the tasks that could possibly run in the current process. Or perhaps --load-average should have defaulted to something like min(2, CPUs×2) when --jobs was in effect⁴.

The biggest bottleneck hit when using --jobs back then wasn't RAM or CPU, though: it was random IO on traditional high-latency drives. A couple of parallel jobs could make much better use of even a single-core CPU, by overlapping the CPU-crunching of a busy task or two with the IO of other tasks. But too many concurrent tasks would result in an IO flood that could practically stall the affected drives for a time, putting the CPU back into a state of waiting ages for IO (probably longer than it would be without multiple jobs running). This would throttle a machine² before it ran out of RAM, even with the small RAM we had back then compared to today. With modern IO and core counts, I can imagine RAM being the bigger issue now.

--------

[0] Well, used, I've not touched make for quite some time

[1] Back when I last used make much at all, small USB sticks and SD cards were not uncommon, but SSDs big, quick, and hardy enough for system or work drives were an expensive dream. With frisbee-based drives I found a four-job limit was often a good compromise, approaching but not hitting significantly diminishing returns if you had sufficient otherwise-unused RAM, while keeping a near-zero chance of effectively stalling the machine completely due to a flood of random IO.

[2] Or every machine… I remember some fool³ bogging down the shared file server of most of the department with a vast parallel job, ignoring the standing request to run large jobs on local filesystems where possible anyway.

[3] Not me, I learned the lesson by DoSing my home PC!

[4] Though in the case of causing an IO storm on a remote filesystem, a load-average limit might be much less effective.

davemp · 5h ago
Thanks for the historical perspective. It probably was less of an issue on older hardware because you can ctrl-c if you’re IO starved. Linux user spaces do not do well when the OOM killer comes out to play.

Personally, I don’t think these footguns need to exist.

bayindirh · 6h ago
I’ll kindly disagree on wasting junior developer time. Any person who’s using tools professionally should read (or at least skim) the manual of said tool, especially if it’s something foundational to their whole workflow.

They are junior because they are inexperienced, but being junior is the best place to make mistakes and learn good habits.

If somebody asks what is the most important thing I have learnt over the years, I’d say “read the manual and the logs”.

davemp · 5h ago
There’s a difference between understanding your tool and unnecessary cognitive load.

Make does not provide a sane way to run in parallel. You shouldn’t have to compose a command that parses /proc/cpuinfo to get the desired behavior of “fully utilize my system please”. This is not a detail that is particularly relevant to conditional compilation/dependency trees.

This feels like it’s straight out of the Unix Haters Handbook.

[0]: https://web.mit.edu/~simsong/www/ugh.pdf see p186
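For reference, the incantation people end up composing looks roughly like this (a sketch; the fallback chain and the -l load cap are illustrative, not prescribed by make):

```shell
# Derive a job count from the machine: nproc on Linux,
# sysctl as a fallback on macOS/BSD.
jobs="$(nproc 2>/dev/null || sysctl -n hw.ncpu)"

# -j caps concurrent jobs; -l additionally backs off when the
# load average is already high. Printed here rather than executed.
echo make -j"${jobs}" -l"${jobs}"
```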

duped · 3h ago
It's trivial to go OOM on a modern dev machine with -j$(nproc) these days because of parallel link jobs. Make is never the bottleneck, it's just the trigger.
holsta · 9h ago
> A couple make flags that are useful [..]

But not portable. Please don't use them outside of your own non-distributable toy projects.

deng · 8h ago
I will not restrict myself to an arcane subset of Make just because you refuse to type 'gmake' instead of 'make'. Parallel execution, pattern rules, order-only prerequisites, includes, not to mention the dozens of useful functions like (not)dir, (pat)subst, info... There's a reason why most POSIX Makefiles nowadays are generated. It's not GNU's fault that POSIX is stale.

EDIT: There's one exception, and that would be using Guile as an extension language, as that is often not available. However, thanks to conditionals (also not in POSIX, of course), it can be used optionally. I once sped up a Windows build by an order of magnitude by implementing certain things in Guile instead of calling shell (which is notoriously slow on Windows).

Tor3 · 8h ago
Agreed. My company decided on using GNU Make on every platform we supported, which back then (last century) was a bunch of Unix variants, and Linux. That made it possible to write a simple and portable build system which could be used for everything we did, no hassle. And not difficult, because gmake was available basically everywhere, then just as now.
matheusmoreira · 8h ago
Completely agree. POSIX is irrelevant anyway. Every single unixlike has unique features that are vastly superior to whatever legacy happens to be standardized by POSIX. Avoiding their use leads to nothing but misery.
stabbles · 8h ago
The guide is basically about GNU Make, and the flags are obviously just for end users to invoke make.
f1shy · 8h ago
Not every project has to be a multi-platform, multi-OS, multi-language monster. It is perfectly fine to target a specific set of architectures, OSes, etc. And I find it insulting and silly to call those "toy projects".
nrclark · 8h ago
Agreed if you're looking at it through the lens of portable software that you plan to distribute. Automake generates portable Makefiles for a reason.

But there's another huge category: people who are automating something that's not open-source. Maybe it stays within the walls of their company, where it's totally fine to say "build machines will always be Ubuntu" or whatever other environment their company prefers.

GNU Make has a ton of powerful features, and it makes sense to take advantage of them if you know that GNU Make will always be the one you use.

matheusmoreira · 8h ago
Portability is overrated. Better to make full use of one's tools. Restricting oneself to some "portable" subset of all features is pure masochism.

GNU Make is feature rich and is itself portable. It's also free software, as in freedom. Just use it.

Tor3 · 7h ago
And it's available everywhere. All Unix platforms had it back then, and the still existing ones (AIX is alive, at least) have it available. Which made it easy for our company to base our build system on GNU Make for everything, back in the day.
f1shy · 8h ago
Not only overrated, but also the source of extremely complex and gigantic pieces of software, which end up being a nightmare to keep updated.

Just like optimization, it has its place and time.

wahern · 7h ago
People are too quick to [ab]use GNU Make features. IME, learning how to make do with portable make constructs can help discipline oneself to avoid excessive complexity, especially when it comes to macro definitions where GNU Make's Lispy looping and eval constructs are heavily overused and quickly lead to obtuse, impenetrable code. POSIX pattern substitutions are quite powerful and often produce easier to read code than the GNU equivalent. I'm not sure if computed variable names/nested variable references are well-defined in POSIX (e.g. "$($(FOO))"), but they are widely supported nonetheless, and often more readable than $(eval ...). (They can also be used for portable conditional constructs; I wouldn't argue they're more readable, though I often find them so.)

Some GNU Make constructs, like pattern rules, are indispensable in all but the simplest projects, but can also be overused.

For some reason there's a strong urge to programmatically generate build rules. But like with SQL queries, going beyond the parameterization already built into the language can be counter productive. A good Makefile, like a good SQL query, should be easy to read on its face. Yes, it often means greater verbosity and even repetition, but that can be a benefit to be embraced (at least embraced more than is instinctively common).

EDIT: Computed variable references are well-defined as of POSIX-2024, including (AFAICT) on the left-hand side of a definition. In the discussion it was shown the semantics were already supported by all extant implementations.
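A computed-variable-name conditional, as a minimal sketch (the variable names here are illustrative, not from any particular project):

```make
# Select flags by expanding a variable whose name is itself computed.
MODE ?= debug

CFLAGS_debug   := -O0 -g
CFLAGS_release := -O2 -DNDEBUG

# $(CFLAGS_$(MODE)) expands MODE first, then looks up CFLAGS_debug
# or CFLAGS_release: a conditional without ifeq.
CFLAGS := $(CFLAGS_$(MODE))
```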

f1shy · 7h ago
Absolutely. But the target has nothing to do with “portability” but a more fundamental and important principle(s) of readability and maintainability.
wahern · 6h ago
It's a matter of praxis. Targeting portable constructs is (IMO) a useful methodology for achieving the abstract goal. It doesn't have to be strict, but it provides a quantifiable, objective metric (i.e. amount of non-portable constructs employed) to help achieve an otherwise subjective goal.

Otherwise you face an ocean of choices that can be overwhelming, especially if you're not very experienced in the problem space. It's like the common refrain with C++: most developers settle on a subset of C++ to minimize code complexity; but which subset? (They can vary widely, across projects and time.) In the case of Make, you can just pick the POSIX and/or de facto portable subset as your target, avoiding a lot of choice paralysis/anxiety (though you still face it when deciding when to break out of that box to leverage GNU extensions).

immibis · 2h ago
Unless you are actually targeting all of those platforms, of course. Which you're not.
signa11 · 6h ago
Exactly! Instead of writing portable Makefiles, use portable make!
holsta · 3h ago
> Portability is overrated.

> GNU Make is [..] itself portable.

Sounds like it's not overrated, then. You just prefer that other people write portable C and package GNU Make for all systems instead of you writing POSIX Make.

leetrout · 6h ago
The article says most people don’t mark recipes as .PHONY and seems to use that as a reason to not bother in the tutorial. I think that is a weak excuse and we should teach the right way to use a tool.

My teammates gave me a hard time for adding and maintaining .PHONY on all our recipes since we use make as a task runner.

Clark Grubb has a great page explaining a style guide for make files:

https://clarkgrubb.com/makefile-style-guide

Does anyone else use this style guide? And for phony recipes, do you mark .PHONY at the recipe declaration, or keep a giant list at the top of the file?

I would love to have a linter that enforced this…

nrclark · 1h ago
I just gave that a read. Good doc overall. There are a few items I disagree with:

- Cargo-culted use of -o pipefail. Pipefail has its uses, but it breaks one of the most common things people do in a pipeline: filter the output with grep. Add it on a per-recipe basis instead.

- Marking non-file targets as .PHONY. This is strictly correct, but it's usually not necessary. I think it adds unneeded verbosity to the Makefile, especially if you have a lot of targets. Better to add it on an as-needed basis IMO.

- Recipes with multiple output files. Use of dummyfiles/flagfiles used to be the standard if a pattern-rule wasn't the right fit. But as of GNU Make 4.3 (shipping in Ubuntu 22.04 LTS), there is native support for grouped targets. Check it out here: https://www.gnu.org/software/make/manual/html_node/Multiple-...
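The grouped-targets feature looks like this, as a minimal sketch with hypothetical file names (requires GNU Make 4.3+):

```make
# '&:' declares a grouped target: one recipe invocation produces both
# files, and make knows not to run the recipe once per output.
parser.c parser.h &: parser.y
	bison --defines=parser.h -o parser.c parser.y
```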

danw1979 · 7h ago
Make has its place as a build tool for large C codebases.

People sometimes treat it as a generic “project specific job runner”, which it’s not a good fit for. Even simple conditionals are difficult.

I’ve seen several well-intentioned attempts at wrapping Terraform with it, for example, which have ended terribly.

monkeyelite · 4h ago
It’s not a generic job runner. It’s a generic way to transform linear shell scripts into declarative dependencies. It’s a general tool for the shell.
creata · 6h ago
Is there a good generic job runner?

Edit: Sorry, it looks like I totally misunderstood what you meant by "job runner".

homebrewer · 5h ago
Sure, a bash script.

People keep writing and using other alternatives (like just), which provide a very slight improvement on pure shell at the cost of installing yet another tool everywhere.

I stick with bash, write every task as a separate function, and multiplex between them with a case statement (which supports globs et al. and is very readable).
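That pattern, as a minimal sketch (task names and commands are placeholders):

```shell
#!/usr/bin/env bash
# One function per task; a case statement dispatches on the first argument.
set -u

build()  { echo "building (placeholder)"; }
deploy() { echo "deploying (placeholder)"; }

run_task() {
  case "${1:-}" in
    build | b)  build ;;       # case patterns support globs and alternation
    deploy | d) deploy ;;
    *) echo "usage: $0 {build|deploy}" >&2; return 1 ;;
  esac
}

run_task "${1:-build}"
```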

EPWN3D · 4h ago
Years ago, I discovered git-rev-parse's option parsing, and it completely removed any excuse I had not to write my own personal bash scripts to a professional standard.

Now when I need a tool, I can knock it out in bash with proper option parsing, usage, etc.

bash is awful on a lot of fronts, but if you're writing code that's primarily calling a bunch of tools and mucking with their output, it's still the best thing out there I've found just due to piping syntax.

dakom · 5h ago
Taskfile and Justfile are pretty solid.
o11c · 2h ago
There are some dangerous and subtle problems with this tutorial.

In order to handle long options and empty short options, when searching MAKEFLAGS you really need to do:

  ifneq (,$(findstring t,$(firstword -$(MAKEFLAGS))))
If you need compatibility with the ancient version of `make` shipped by default on OS X, note that it is missing a lot of functions and features, some subtly.

Most other problems are either obvious typos or due to violating best-practice style, so I won't go into them.

Note also that `load` is actually more portable than `guile`, but make sure you use the correct compiler in case you're cross-compiling the main project.

***

Really you should just read Paul's Rules of Makefiles [1], then the GNU make manual [2]. Some of the other manuals related to GNU build tools are also relevant even if you aren't using the particular tool. I also have a demo project [3] if you just want something that works and isn't complicated.

[1]: https://make.mad-scientist.net/papers/rules-of-makefiles/

[2]: https://www.gnu.org/software/make/manual/

[3]: https://github.com/o11c/makefile-demo

llukas · 8h ago
This is an excellent modern replacement for the part where Makefiles get messy: https://github.com/casey/just
thristian · 7h ago
It replaces the "list of short shell-scripts" aspect of Make, but it doesn't replace the "only execute rules that need to be re-executed" part, which is the actually useful bit.
amelius · 6h ago
Sounds good. If it isn't broken, don't fix it.
ajross · 5h ago
This is the most frustrating bit of this weird recursive ecosystem of build tools. No one really uses all of make, so they only clone the bits they need, so their tool is simple and clean and beautiful to a subset of the community that has their same problem. But it can't replace make, so seven months later someone with a slightly different problem shows up with a make replacement, and the circle of life continues.

And you see this on the other side of the problem area too, where large and ugly tools like cmake are trying to do what older large and ugly software like autotools did, and trying to replace make. And they suck too.

I continue to believe that the GNU make of the late 80s was, and remains, a better generic tool than everything in the modern world in all ways but syntax (and in many cases, again cf. cmake, it had better syntax too). Had the original v7 syntax used something other than tabs, and understood that variable names longer than 1 byte were a good thing, we might never have found ourselves in this mess.

PhilippGille · 7h ago
Or:

- Task (Go): https://github.com/go-task/task

- Cake (C#): https://github.com/cake-build/cake

- Rake (Ruby): https://github.com/ruby/rake

Or an entirely different concept: Makedown, as discussed on HN 8 months ago: https://news.ycombinator.com/item?id=41825344

izabera · 7h ago
they do place themselves as an alternative to make, but imho they're entirely different and not at all comparable. make is centered around creating artefacts and not rebuilding what is already built. just is a command runner.
izoow · 7h ago
The main benefit I see with using Make as a command runner is that it's a standard tool that's installed "everywhere". Even though these replacements seem nicer to use, I never felt like they bring enough to the table to warrant having to install an extra tool.
Lyngbakr · 7h ago
Task* is another alternative, although I admittedly only use it with simple hobby projects in C so I can't speak to whether it scales well or not.

*https://taskfile.dev/

syklemil · 5h ago
I also use just as a command runner, but I gotta agree with the others here that it should be described accurately as a command runner, while make is a build system.

There are some uses of make, especially by people who have never used it to build C/C++ projects, which make more sense to replace with just. It doesn't have the baggage that make does, and they're not using it to actually make files. They also quite likely don't know the conventions (e.g. what a lot of us expect "make install" to do), and I support them in not learning the conventions of make—as long as they use something else. :)

Other uses of make will need other modern replacements, e.g. Cmake or Bazel.

It is possible that Kids These Days can say "no thanks" when someone tries to teach them make, and that the future of make is more along the lines of something us greybeards complain about. Back in _my_ day, etc.

stabbles · 9h ago
Another thing that's interesting lately is that CMake has decided that Makefiles are unfit for projects that use C++20 modules, and ninja is the way to go. [1]

Basically it's considered too hard if not impossible to statically define the target's dependencies. This is now done dynamically with tools like `clang-scan-deps` [2]

[1] https://cmake.org/cmake/help/latest/manual/cmake-cxxmodules....

[2] https://llvm.org/devmtg/2019-04/slides/TechTalk-Lorenz-clang...

nrclark · 8h ago
Modules are a disaster tbh.
dgan · 8h ago
can you expand on that?
nrclark · 3h ago
Yes. They were approved for C++20 with no working reference implementation. This was done over objections from representatives from every compiler and build system. 5 years later they're still not widely (or fully) implemented.

They're impossible to implement in Make, which is without exaggeration the world's most widely-used build tool. Even CMake has had a very difficult time implementing them. They break most methods for incremental builds, and mean that a compiler is needed just to determine staleness. They also make fully parallelized compilation impossible, because dependencies can't be fully resolved by the build system.

alextingle · 7h ago
If you can't easily reason about dependencies, then your builds will just get more and more bloated.

People who care about build systems are a special kind of nerd. Programmers are often blissfully ignorant of what it takes to build large projects - their experience is based around building toy projects, which is so easy it doesn't really matter what you do.

In my experience, once a project has reached a certain size, you need to lay down simple rules that programmers can understand and follow, to keep them from exploding the build times. Modules make that extra hard.

skydhash · 7h ago
These days I like system paths for deps more and more. You just specify the paths, and everything in there can be included in the project. But Gradle-style shenanigans, where the dependency graph is built by some obscure logic, are not to my liking.
monkeyelite · 4h ago
Nobody implemented them except Msft
andreynering · 6h ago
I'm the creator and one of the maintainers of an alternative to Make: Task.

It has existed for 8+ years and still evolving. Give it a try if you're looking for something fresh, and don't hesitate to ask any questions.

https://taskfile.dev/

https://github.com/go-task/task

ulbu · 6h ago
and another alternative: just

https://github.com/casey/just

TickleSteve · 6h ago
Stop with the alternatives... just use make for this task.

Seriously. :o)

dakom · 5h ago
Funny coincidence, I use this often and just opened an issue earlier today: https://github.com/go-task/task/issues/2303 :)
andreynering · 2h ago
I just responded.

And thanks for your support!

hahn-kev · 5h ago
Thank you for making it! We love it
amelius · 6h ago
Does anyone have experience with tup?

https://gittup.org/tup/ex_dependencies.html

It is a build system that automatically determines dependencies based on file system access, so it can work with any kind of compiler/tool.

articsputnik · 6h ago
I love my Makefiles. All my GitHub repos contain a Makefile, as I always forget the commands for each repo. This way I have them stored nicely, I can add complex steps, and I can run `make` on any of my projects and it will do what I'd expect without remembering any commands.
jekwoooooe · 5h ago
In 2025 makefiles are once again only for C projects at best. For task running, use just or mise.
pards · 7h ago
> Note: Makefiles must be indented using TABs and not spaces or make will fail.

Oh no. I have never worked with Makefiles but I bet that causes pain and suffering.

I've lost so many hours to missing/extraneous spaces in YAML files that my team recently agreed to get rid of YAML from our Spring Boot codebase.

f1shy · 7h ago
A non-issue if you use a half-decent editor, just as it isn't a problem for Python today.
PhilipRoman · 5h ago
Do people not have different symbols for spaces/tabs in IDEs? I see people committing mixed or trailing whitespace but every editor I've used shows spaces and tabs clearly.

Agree about yaml though. I still have to look up how to align a multiline string every single time.

ho_schi · 7h ago
Aside from the editor thing. I indent usually with Tab :)

I learned Makefiles a bit, using them in one tiny project. Then I checked Autotools, and everything in my brain refused to learn that awkward workaround-engine. At around the same time Meson[1] appeared, and the thing with builds, dependencies and testing was solved :)

[1] https://mesonbuild.com

PS: Dependency handling with Meson is awesome.

izabera · 7h ago
this is literally never an issue because every editor automatically uses tabs for makefiles
thesnide · 6h ago
actually requiring tabs is a godsend. no more off-by-one-space error
bitwize · 7h ago
An editor that groks Makefiles will help immensely, as it will ensure that the TAB key does the right thing. Emacs is good at this.

Of course the real solution is: just use CMake, you dweeb.

codelikeawolf · 3h ago
I was a little surprised by this bullet point for when make would be an appropriate build tool:

> The build system does not need to be highly portable.

I know "highly" is a vague qualifier here, but I pretty much always default to a Makefile in Go projects and have used it to build Electron apps on Linux, macOS, and Windows (without WSL, just Make for Windows). You have to do a little extra finagling to get the executable paths right, but it works well enough for my purposes.

To some extent, I get why Make gets a lot of hate. But if you keep them simple, they provide a great way to get around some of the limitations of package.json scripts (e.g., adding comments).

Joker_vD · 6h ago
You know, for small-ish C projects I found that the easiest way to handle the "which .h files do the .c files depend on" question is to just say "on all of them".

    SOURCE_FILES := $(wildcard $(SRC_DIR)/*.c)
    HEADER_FILES := $(wildcard $(SRC_DIR)/*.h)

    OBJ_FILES := $(patsubst $(SRC_DIR)/%.c,$(BUILD_DIR)/%.o,$(SOURCE_FILES))

    .PHONY: build clean

    build: $(BUILD_DIR)/$(TARGET)

    clean:
        rm -rf $(BUILD_DIR)

    $(BUILD_DIR):
        mkdir $(BUILD_DIR)

    $(BUILD_DIR)/$(TARGET): $(OBJ_FILES) | $(BUILD_DIR)
        $(LINK.o) $^ $(LDLIBS) -o $@

    $(BUILD_DIR)/%.o: $(SRC_DIR)/%.c $(HEADER_FILES) | $(BUILD_DIR)
        $(COMPILE.c) $< -o $@
So when you don't fiddle with inter-file/shared interfaces, you get an incremental rebuild. When you do — you get a full rebuild. Not ideal, but mostly fine, in my experience.

P.S. I just love the way that Make names its built-in variables. The output is obviously $@, but can you quickly tell which of $^ and $< give you only the first of the inputs? What about $> and $∨, do you remember what they do?
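For reference, the real ones: $< is the first prerequisite, $^ is all of them (deduplicated); $> and $∨ don't exist. A quick sketch:

```make
# $@  the target            $<  the first prerequisite
# $^  all prerequisites     $?  prerequisites newer than the target
%.o: %.c $(HEADER_FILES)
	$(COMPILE.c) $< -o $@   # compile only the .c; headers just trigger rebuilds
```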

zwp · 6h ago
I used to like having a "depend" target to make the dependencies explicit and so minimize build time, although that fiddles with the contents of the Makefile (some discussion at https://wiki.c2.com/?MakeDepend).

The standalone makedepend(1) that does the work is available in package xutils-dev on Ubuntu.

EPWN3D · 4h ago
You can have gcc and clang output dependency files that your Makefile can include. Those contain rules telling make which headers each source file depends on.
Joker_vD · 4h ago
I know I can do that. But it's fiddly, and doesn't really save time for small to medium projects: on small projects, full recompilation is fast enough that the time to regularly re-run "gcc -MMD" is noticeable and wasted; it simply is faster not to bother with it. And for medium projects, in my experience, the headers tend not to change all that often, and when they do, you need to rebuild about 30-50% of all the sources anyway, so you might as well rebuild 100% just to be on the safe side. I've had enough pitiful debugging experiences where the executable code did not match the source files, fixed only by a "make clean build".

And when you change flags, compiler versions, or system header versions, you still need a clean rebuild, so unless you write your makefiles the way that e.g. CMake generates them (I am willing to bet nobody does that)...

o11c · 2h ago
If you do a lazy `-include *.d`, there's no time wasted on `-MMD`, since it only runs as part of your main compile.

The only "gotcha" is if you delete a header but forgot to give GCC the option that says "I might delete a header in the future".
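Put together, the pattern looks roughly like this (a sketch; file layout and variable choices are illustrative). The deleted-header option is -MP:

```make
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

# -MMD writes foo.d alongside foo.o during normal compilation;
# -MP adds an empty phony rule per header, so a deleted header
# doesn't leave make complaining about a missing prerequisite.
CPPFLAGS += -MMD -MP

prog: $(OBJS)
	$(CC) $^ $(LDLIBS) -o $@

# Lazy include: missing .d files (e.g. on the first build) are ignored.
-include $(SRCS:.c=.d)
```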

Joker_vD · 1h ago
Oooh, there is an "-MP" flag, which does... something? Anyway, it's all just a gigantic hack at this point: if we defer to gcc to figure out the accurate dependencies anyway, why do we even use make in the first place? I mean, I know the answer ("that's how it came to be, historically"), it just rubs me the wrong way since gcc, strictly speaking, is already a compiler driver that orchestrates running the internal utilities so it may as well just bloody learn how to run them all in one swell foop or whatever.
circadian · 5h ago
Very glad to see a tutorial like this. Make is something I've used relentlessly because it just works, but I know I'm missing out on a lot more that it can help with, because of my feeling that the docs are inaccessible. Knowing that this is here, waiting for the day when a project calls for something just a little more, means I won't bloat out my development workflow. Something a little more friendly than the make docs themselves lowers the barrier for me. Nice one! :)
zabzonk · 5h ago
> the docs are inaccessible

The GNU make documentation is excellent - some of the best technical writing I've come across.

buserror · 9h ago
Very nice article; it mentions all the modern bits that help make Makefiles so, SO much easier than in decades past...

The interesting bits are, for example, the -MMD flag to gcc, which outputs a .d file you can `-include $(wildcard *.d)` to get free, up-to-date dependencies for your headers etc.

That, and 'vpath' to tell it where to find the source files for % rules. With those, all the hard work is done, and your half-page Makefile will stay the same 'forever' and will still work in 20 years...

donatj · 7h ago
Generally speaking, it would be nice if the examples had simple execution examples like

    $ make foo
    Hello foo
    This ran too!
That's a contrived example, but some of these take a bit too much thought to work out the execution order and rule selection from the example Makefile alone.

It would just be very helpful to have clear examples of when I run this, I get this.

globular-toast · 7h ago
Make is one of those things that I'm really glad I learnt at the beginning of my career. Not because I use it much any more, but because it showed me the power of a declarative system over an imperative one.

I also realised at one point how naturally the idea extends to other tasks that I do. Going by the picture at the top of this site, it seems the author realised a similar thing to me: you can understand food recipes better if you think about them declaratively like makefiles, rather than imperatively like scripts, which is how recipes are traditionally written down.

I wrote about it here: https://blog.gpkb.org/posts/cooking-with-make/

I always scribble down recipes in a way that I can read like a Makefile and take that into the kitchen with me. I'm curious if anyone has tried typesetting or displaying recipes in this way as I feel like it would save a lot of time when reading new recipes as I wouldn't have to convert from a script to a makefile myself.

wrasee · 7h ago
A nice thing about this approach is that it passes more control to the user who is essentially now responsible for resolving the dependency graph themselves and “be” the executor. Taking your cooking example, the declarative nature better exposes where there are open choices in what to do next, which affords the user more freedom to take into account other externalities and constraints not formally specified in the makefile (like specific orderings that make washing up easier).

Of course the tradeoff is that you have to resolve the dependency graph yourself. That’s more work on you when you just want a set of pre-serialised, sequential steps to follow.

torcete · 5h ago
In bioinformatics I used to use snakemake or perhaps nextflow. I wonder if makefiles could be used with the same effectiveness.
uncircle · 7h ago
> it's specifically written for GNU Make, which is the standard implementation on Linux and MacOS

Is this true? Doesn't macOS ship with BSD make?

oneeyedpigeon · 7h ago
From macOS 15.5:

    [~/home] $ which make
    /usr/bin/make

    [~/home] $ make --version
    GNU Make 3.81
    Copyright (C) 2006  Free Software Foundation, Inc.
    This is free software; see the source for copying conditions.
    There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
    PARTICULAR PURPOSE.

    This program built for i386-apple-darwin11.3.0
arccy · 6h ago
It may be gnu make, but it's 20 years old: https://lists.gnu.org/archive/html/info-gnu/2006-04/msg00000...

so for all intents and purposes, it's a different make than what most people think about when they say gnu make.

    $ gmake --version
    GNU Make 4.4.1
    Built for aarch64-apple-darwin24.0.0
    Copyright (C) 1988-2023 Free Software Foundation, Inc.
    License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>
    This is free software: you are free to change and redistribute it.
    There is NO WARRANTY, to the extent permitted by law.
syklemil · 5h ago
This seems similar to how macOS has bash, but an ancient bash. Allegedly the reason is that Apple is fine with GPLv2 but not GPLv3.

Though at that point I kinda wonder why they bother shipping bash at all, when their default shell is zsh, and it's entirely possible to have system shell scripts run by the BSD-licensed dash, rather than bash.

There's probably also something to be said about the choice of running ancient GNU make rather than BSD make, but I don't know enough about the differences to say it.

EPWN3D · 4h ago
Personally I like being able to rely on Apple's software supply chain for these tools since they're so fundamental. They're sitting there on the signed system volume and cannot be tampered with, and they haven't been changed in a decade or more.
donatj · 8h ago
I have been working with Makefiles for over a decade, though never with C nor C++

I knew there was a lot of weirdness and baggage but I am frankly kind of horrified to learn about these "implicit rules" that seemingly automatically activate the C compiler due to the mere presence of a rule that ends in ".c" or ".o"
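For anyone else who hasn't run into them: a sketch of how little it takes to trigger the implicit rules (GNU make's built-in rule database supplies both recipes here):

```makefile
# A complete, working Makefile: no recipe written anywhere.
# The built-in pattern rules compile main.o from main.c with
# $(CC) $(CFLAGS) -c, then link main from main.o.
main: main.o
```

Running `make -p -f /dev/null` dumps the full built-in database of rules and variables, which is instructive (and a little terrifying).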

smidgeon · 7h ago
Implicit rules are our friends, in life as in make.
alextingle · 7h ago
A serious makefile will disable all the default rules by defining the empty rule ...

.SUFFIXES:

matheusmoreira · 9h ago
Makefiles are great but do try not to get carried away. Years ago I tried to create a pure GNU Make framework, only to realize I was effectively reinventing autoconf. That was the moment I finally understood what the GNU autotools had been made for.

Makefiles are eerily lisplike turing tarpits. GNU Make even has metaprogramming capabilities. Resisting the urge to metaprogram some unholy system inside the makefile can be difficult. The ubiquitousness of GNU Make makes it quite tempting.
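For anyone who hasn't seen the metaprogramming side, a small (deliberately restrained) sketch using define/call/eval to stamp out rules at parse time:

```makefile
PROGS := foo bar

# A rule template; $$ defers expansion until $(eval) parses the result
define PROG_template
$(1): $(1).o
	$$(CC) $$(LDFLAGS) -o $$@ $$^
endef

# Generate one link rule per program in PROGS
$(foreach p,$(PROGS),$(eval $(call PROG_template,$(p))))
```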

signa11 · 9h ago
Why single meson out? It's infinitely better than most alternatives mentioned in the article.
kjgkjhfkjf · 8h ago
Why would you use make for a C or C++ project when bazel exists?
taminka · 7h ago
because i don't want to learn another C build system and i actually have useful stuff to get done
elteto · 7h ago
Cross-compilation and using multiple toolchains was/is a nightmare in bazel (at least it was until a couple of years ago).

Not saying make is strictly better here but at least you can find plenty of examples and documentation on it.

akazantsev · 7h ago
Because I "love" to install Java runtime to build my C/C++ applications.
taminka · 5h ago
jesus christ it's written in fucking java of all things?
johnisgood · 2h ago
Must be a joke, right?

If not Makefile, then use ninja (and/or meson).

Toritori12 · 5h ago
I am just happy with xmake for my small personal project (about 1-2K LOC). Really glad I migrated away from cmake.
claytonaalves · 7h ago
I work with Makefiles on Delphi/FreePascal projects.
akoboldfrying · 6h ago
Looks beautiful, but I got as far as the first dependency graph illustration (yellow and light green) and noticed that the final target is a source file, main.cpp, when it should almost certainly be a binary ("main.exe" or simply "main"). Similarly, one.cpp does not depend on one.h in the way that make cares about.

Make cares only about how to generate files from other files.

I point this out because this is one of the classic misunderstandings about dependencies that beginners (and sometimes old hands) make. The code inside main.cpp might well depend on code in one.cpp to work, but you don't generate main.cpp from one.cpp et al. You generate the final binary from those other files, and that's what make cares about.

One right way to show it would be to have one.o depend on both one.cpp and on one.h (yes, this is confusing at first), likewise for two.o, and main.exe depend on one.o, two.o, libc and libm. Another way would be to omit the object files completely (as now), and just have main.exe depend directly on all other targets, but this makes for a less helpful example.
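In Makefile terms (names illustrative), the corrected graph is just:

```makefile
# The binary is the final target; each object depends on the source
# it is generated from AND the headers that source includes.
# libc is linked implicitly; -lm covers libm.
main: one.o two.o
	$(CXX) $(LDFLAGS) -o $@ $^ -lm

one.o: one.cpp one.h
two.o: two.cpp two.h
```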

ETA: I'd also appreciate it if you would mention in the "Recursive use of make" section that calling make recursively is a Bad Idea [0]. (Why? In short, because no dependency information can cross that process barrier, so there's always a risk that you don't build things in the right order, forget to build something you should, etc. If you have a hierarchy of projects in a subdir hierarchy, it's much better to use a separate "Makefile fragment" file in each subdir, and then "include" them all into a single top-level Makefile, so that make has a chance to ensure that, ya know, things get built before other things that need them.) I realise the GNU docs themselves don't say so, and GNU make has a ton of hacks to accommodate recursive make (suggesting that this is a blessed way), but that is simply unfortunate.

[0]: "Recursive Make Considered Harmful", https://accu.org/journals/overload/14/71/miller_2004/
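The fragment scheme might look roughly like this (names illustrative; one make process sees the whole graph):

```makefile
# ---- top-level Makefile: include per-directory fragments instead of
# ---- recursing, so make sees one complete dependency graph
include lib/module.mk
include app/module.mk

app/app: app/main.o lib/lib.a
	$(CC) -o $@ $^

# ---- lib/module.mk (paths written relative to the top directory) ----
lib/lib.a: lib/foo.o lib/bar.o
	$(AR) rcs $@ $^
```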

90s_dev · 5h ago
I remember learning how simple and pure Makefiles were back in about 2010, and then running into so many bumps that a friend recommended CMake. But usually CMake just uses Make under the hood anyway. I really do like this idea of "build a small, simple, and robust layer, and if it's not good enough, build another layer on top of it." But I do wonder what life would be like if we started from scratch by merging all these layers. Would it be Cargo? Probably not, I doubt it's as flexible as all these layers combined.
immibis · 2h ago
CMake normally runs on top of Ninja, which is like the execution engine of Make with all the implicit rules and metaprogramming stripped out. (Not enough stripped out IMO)
revskill · 6h ago
Makefile is bad, like Python, due to whitespace sensitivity.
Joker_vD · 6h ago
Almost all modern-ish languages are sensitive to whitespace, you can't write e.g. "voidfunc(intx){return0;}" in C.

In fact, I don't think there is any other space-insensitive language apart from early versions of FORTRAN.

pklausler · 5h ago
You can still write Fortran without spaces if you like. Or with spaces in the middle of tokens, too. It makes parsing fun.