Advice for managing compile dependencies and incremental tasks

I’m wondering about the best approach for managing a custom compile step with files that have transitive dependencies if I want my compile task to be incremental. Here’s the specific use case: I have a directory full of templates (Handlebars templates in this case). Let’s say these are in a pages directory. Some of these templates include other templates (Handlebars partials). Let’s say the included templates are in includes. Compiling all the templates is fairly easy. For example, I could use the handlebars command, as in handlebars <input-file>, to compile every file in pages (e.g. srcDir.eachFile { file -> project.exec { commandLine 'handlebars', file.name } }).
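Spelled out as a (naive, non-incremental) task, that would look something like the following. The task name and directory names are just placeholders, and I’m glossing over where the compiled output actually goes, same as in the snippet above:

```groovy
// build.gradle -- the "just compile everything" version. Names are placeholders;
// output handling is omitted, as in the snippet above.
task compileTemplates {
    def pagesDir = file('pages')

    doLast {
        fileTree(pagesDir).each { template ->
            exec {
                // Same command as above, once per template in pages.
                commandLine 'handlebars', template.absolutePath
            }
        }
    }
}
```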

When called, the handlebars command compiles input-file and, if necessary, pulls in any of the templates input-file includes, any of the templates those includes pull in, and so on. Without going into unnecessary detail, I can also learn the full set of transitive dependencies for a template when I compile it. So, for example, if template A includes B and C, and B includes E, then in the process of compiling A I will learn that if B, C, or E changes I need to recompile A. Note that I pretty much need to compile the template to learn this information, as I need to parse it and determine all the places it has an include, how those includes resolve, etc.

I’d like to create a custom, incremental task that compiles files, but only when necessary. I know how to declare the inputs to this task, and I know how to maintain, in memory, a map from files in the includes directory to the templates in pages that depend on them directly or transitively. Thus, if IncrementalTaskInputs.outOfDate includes, for example, includes/E, I know that I need to re-compile A. So far so good, but I’m unclear about the best way to make this actually work.
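To make that concrete, here’s roughly the shape I have in mind for the task action. The dependentsOf map is the in-memory map described above; everything else (class, property, and method names, and the actual compile/output handling) is just placeholder sketching:

```groovy
// buildSrc/src/main/groovy/CompileTemplatesTask.groovy -- a sketch of the incremental task.
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.InputDirectory
import org.gradle.api.tasks.OutputDirectory
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.incremental.IncrementalTaskInputs

class CompileTemplatesTask extends DefaultTask {

    @InputDirectory File pagesDir
    @InputDirectory File includesDir
    @OutputDirectory File outputDir

    // includes file -> pages templates that depend on it, directly or transitively.
    // Keeping this populated across builds is the open question.
    Map<File, Set<File>> dependentsOf = [:]

    @TaskAction
    void compile(IncrementalTaskInputs inputs) {
        if (!inputs.incremental || dependentsOf.isEmpty()) {
            // No usable change or dependency information: recompile everything.
            compileAll()
            return
        }

        Set<File> toCompile = [] as Set
        inputs.outOfDate { change ->
            if (change.file.toPath().startsWith(pagesDir.toPath())) {
                // A page itself changed: recompile just that page.
                toCompile << change.file
            } else {
                // An include changed: recompile every page that (transitively) uses it.
                toCompile.addAll(dependentsOf[change.file] ?: [])
            }
        }
        inputs.removed { change ->
            // A deleted input; stale-output cleanup is omitted in this sketch.
        }

        toCompile.each { compileOne(it) }
    }

    private void compileAll() {
        project.fileTree(pagesDir).each { compileOne(it) }
    }

    private void compileOne(File template) {
        // Compile one template; this is also where the transitive includes reported
        // by the compiler would be recorded back into dependentsOf.
        project.exec { commandLine 'handlebars', template.absolutePath }
    }
}
```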

Clearly, storing the dependencies only in memory won’t work, as the daemon could restart or might not even be running (and I’m not entirely clear on which objects survive in memory in a daemon between runs anyway). As I understand it, Gradle maintains some kind of cache, so it is able to compute the proper IncrementalTaskInputs and know that, for example, only includes/E changed since the task was last run. So I somehow need to maintain a cache of dependencies that I can read off the disk when necessary. I could do that manually by just writing them to a file, but that seems error-prone. For example, an ill-timed Ctrl-C could leave my cache out of sync with Gradle’s cache. I’m guessing there’s a built-in system that allows me to simply declare the dependencies and have Gradle take care of keeping this alongside its own cache. Or better yet, maybe there’s an existing base class that handles this kind of thing and all I have to do is declare what the dependencies are? Are there such things?
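By “manually” I mean something along these lines, added to the task class sketched above. The file name and format are arbitrary, and a missing or unreadable cache simply means falling back to a full build:

```groovy
// Inside the task class from the previous sketch: hand-rolled persistence of
// the dependency map under the task's temporary directory.
private File cacheFile() {
    new File(temporaryDir, 'template-dependencies.txt')
}

private void saveDependencies() {
    cacheFile().withWriter { writer ->
        dependentsOf.each { include, pages ->
            // One line per include: include path, then its dependent pages.
            writer.writeLine(([include.path] + pages*.path).join('|'))
        }
    }
}

private boolean loadDependencies() {
    def file = cacheFile()
    if (!file.exists()) {
        return false                       // no cache: caller should do a full build
    }
    try {
        dependentsOf.clear()
        file.eachLine { line ->
            def paths = line.split(/\|/)
            dependentsOf[new File(paths[0])] = paths.drop(1).collect { new File(it) } as Set
        }
        return true
    } catch (Exception ignored) {
        // Corrupt or half-written cache (the ill-timed Ctrl-C case): full build instead.
        return false
    }
}
```

Even then, a cache file that parses fine can still be out of sync with Gradle’s own bookkeeping, which is exactly what worries me, so the only safe fallback seems to be a full rebuild.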

There is no such ready-to-reuse mechanism yet. We have incremental compilation for Java and the idea of opening that up further has come up internally. But there is no concrete plan yet that I know of.

As long as the buildscript classpath does not change, classes survive between Gradle invocations. Instances (like the project) are recreated on each build, though. So as a first shot you could store things in static variables. If that state is missing, you do a full build. That would already help a lot for anyone using the daemon with your plugin.
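As a rough sketch of what I mean (the class and field names are just placeholders):

```groovy
// buildSrc/src/main/groovy/TemplateDependencyCache.groovy -- as long as the
// buildscript classpath does not change, this class (and its static state)
// survives between builds in the same daemon. A fresh daemon, no daemon, or a
// classpath change means the map is empty.
class TemplateDependencyCache {
    // includes file -> pages templates that depend on it
    static final Map<File, Set<File>> DEPENDENTS = [:]
}
```

Your task action would then read and update TemplateDependencyCache.DEPENDENTS instead of an instance field, and treat an empty map as “do a full build”.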

Thanks @st_oehme. I cross-posted this to StackOverflow:

If you want the karma points, feel free to post your answer there. If not, I’ll copy and paste it there (with proper attribution).