Using Make to batch process files

Hello all,

I have a make question, and I was hoping somebody here might be able to point me in the right direction.

Here is my issue: I have a command-line tool that converts an input XML file into an output binary file. However, this particular tool needs to load an overarching project file when it starts, and that initial parsing takes significant time. So if I have 100 XML files that need processing, I waste a lot of time starting the tool over and over.

I like to use make for this kind of processing, because the dependency handling is so good. However, in this instance I would like to send all of the XML files that need processing to the tool in a single batch. Typically my make rules look like this:

%.bin : %.xml
           tool -export $<

... but this of course will run "tool" once for each XML file that is out of date. I had considered this:

$(ALL_BIN) : $(ALL_XML)
                 tool -export $(ALL_XML)

... which will batch up the job, but it will rebuild every binary file if just a single XML file changes. Does anybody know a way to keep make's dependency checking on a per-file basis, but run the actual command only once? Thanks much for any advice!

Running separate jobs is generally preferred -- it lets make split the work across multiple cores, so it can do more at once. If you actually have multiple cores, you may find it faster to keep the per-file rule and just run make -j2, -j4, etc.
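For instance (a toy sketch, assuming GNU make; sed stands in for your real tool just so the rule is runnable), the per-file pattern rule parallelizes with nothing more than a flag:

```shell
# Toy setup: two dummy inputs and a per-file pattern rule,
# with sed as a stand-in for the actual converter.
set -e
dir=$(mktemp -d)
cd "$dir"
printf '<a/>' > a.xml
printf '<b/>' > b.xml
# The recipe line in the generated Makefile begins with a real tab (\t).
printf 'all: a.bin b.bin\n\n%%.bin: %%.xml\n\tsed "s/xml/bin/" $< > $@\n' > Makefile
make -j2    # run up to two conversions at once
ls *.bin    # lists a.bin and b.bin
```

With -j, make still respects the per-file dependencies; it just runs independent recipes concurrently.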

On the other hand, there is an automatic variable for "prerequisites newer than the target": $?. So this could do it:

$(ALL_BIN) : $(ALL_XML)
        tool -export $?

Note that the eight spaces in front of tool there must actually be a tab for this rule to work.
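One caveat, sketched below with invented names ("batch.stamp" as the stamp file, and an echo standing in for the real tool): because make treats a rule with several explicit targets as one rule per target, the recipe above may still fire more than once depending on your make version and how the .bin targets are reached. A common workaround is a single stamp file that represents the whole batch:

```shell
# Sketch: scratch directory, two dummy XML inputs, and a stamp-file
# Makefile. "batch.stamp" is an invented name; echo plays the tool.
set -e
dir=$(mktemp -d)
cd "$dir"
: > a.xml
: > b.xml
# Recipe lines in the generated Makefile begin with a real tab (\t).
printf 'all: batch.stamp\n\nbatch.stamp: a.xml b.xml\n\t@echo "exporting: $?"\n\t@touch batch.stamp\n' > Makefile
first=$(make)    # stamp absent: both inputs count as newer
sleep 1          # many filesystems store whole-second mtimes
touch a.xml      # modify a single input
second=$(make)   # only a.xml is newer than the stamp
echo "$first"    # exporting: a.xml b.xml
echo "$second"   # exporting: a.xml
```

Here $? still narrows the batch to only the changed inputs, while the stamp file guarantees the tool starts exactly once per make run.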