So I have this Makefile-based build system that my users feel is too slow. For the sake of this question, let's define performance as the time it takes make to figure out what it should actually do.
I can see some avenues for optimization --
- Reducing the number of times the Makefile is parsed and the DAG recalculated because of `include`d Makefile fragments.
- Reducing the number of recursive descents into an external Makefile via `make -C`.
- Reducing variable expansions
- etc.
-- however, I first want to know where my bottlenecks are. Since optimization without profiling is a waste of life, I want to ask: how do I profile a Makefile?
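To make the variable-expansion point concrete, here is the kind of thing I mean (the `pkg-config` call and package name are just an arbitrary example): recursively expanded `=` variables are re-expanded every time they are referenced, while simply expanded `:=` variables are expanded once at assignment, so a costly `$(shell ...)` behind `=` can run many times per parse:

```make
# Recursively expanded: the shell command re-runs on EVERY reference
# to SLOW_CFLAGS anywhere in the Makefile.
SLOW_CFLAGS = $(shell pkg-config --cflags glib-2.0)

# Simply expanded: the shell command runs exactly once, when this
# line is parsed; later references just substitute the stored text.
FAST_CFLAGS := $(shell pkg-config --cflags glib-2.0)
```

But before I go hunting for patterns like this by eye, I'd rather measure.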
Assume that the system I inherited is fairly well designed, i.e. it already implements the most common tricks of the trade: (mostly) non-recursive make, ccache, precompiled headers, auto-generated header dependencies, etc.
... and just to preempt some of the possible answers: I know that there might be faster and better build systems than GNU make -- (personally, I am eagerly waiting to see what the CMake folks will come up with regarding the Ninja system) -- but unfortunately swapping build systems is not in the cards.
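To preempt "what have you measured so far": the only crude baseline I know how to take is timing a dry run, since `make -n` still parses every Makefile, expands variables, and builds the DAG, but executes no recipes, so its wall-clock time approximates make's decision overhead:

```shell
# Dry run: parse + DAG work only, no recipes executed.
time make -n > /dev/null

# Cross-check on an up-to-date tree: after a full build, a no-op
# rebuild spends nearly all of its time on parsing and DAG work too.
make && time make
```

That gives me a total, but not a breakdown, which is exactly what I am asking for.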