How does a composite build ensure that one included build is built before another?

e.g., if included build b depends on coordinates that match included build a, how does Gradle ensure that a is built before b?
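
Concretely, a minimal sketch of that setup (build names and coordinates are placeholders, not my actual build):

// settings.gradle of the composite
rootProject.name = 'composite'
includeBuild 'a'   // a/build.gradle sets group = 'com.example'
includeBuild 'b'   // b/build.gradle declares: implementation 'com.example:a:1.0'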

I’m trying to diagnose and fix a problem in the Chainsaw plugin where a & b seem to be built in parallel, instead of a being built before b. This results in a’s output not being available on the module path for b, so b fails to build.

There’s nothing special there; it just turns external dependencies into project dependencies. If that plugin doesn’t work with composites, there’s a high likelihood it also won’t work with a normal multi-project build. It may have some undeclared dependencies.

Edit: Yep, it seems to completely erase dependency information by turning dependencies into Strings, which Gradle won’t understand. It also resolves all dependencies at configuration time, which is wrong: it will slow down builds and lead to ordering conflicts.

The maintainer should have a look at CommandLineArgumentProvider, which allows you to derive command line arguments from rich objects with proper input annotations. Specifically, the original classpath and the Chainsaw plugin’s patching configuration need to be inputs to the task for it to be correct.
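
For reference, a minimal sketch of such a provider in a Groovy build script (the ModulePathProvider name and the use of sourceSets.main.compileClasspath are illustrative assumptions, not the plugin’s actual code):

import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.Classpath
import org.gradle.process.CommandLineArgumentProvider

// Derives --module-path arguments from a FileCollection while still
// declaring the files as a task input via the @Classpath annotation,
// so up-to-date checks see them even though they never appear in -classpath.
class ModulePathProvider implements CommandLineArgumentProvider {
    private final FileCollection modulePath

    ModulePathProvider(FileCollection modulePath) {
        this.modulePath = modulePath
    }

    @Classpath
    FileCollection getModulePath() { modulePath }

    @Override
    Iterable<String> asArguments() {
        modulePath.isEmpty() ? [] : ['--module-path', modulePath.asPath]
    }
}

compileJava.options.compilerArgumentProviders.add(
    new ModulePathProvider(sourceSets.main.compileClasspath))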

Not sure if this is what you are looking for, but try something like this:

myTask.dependsOn gradle.includedBuild('build-name').task(":some-task-name")

A full example can be found here: https://github.com/daggerok/spring-boot-rest-jms-activemq/blob/master/activemq-spring-stomp/

settings.gradle

rootProject.name = "spring-jms-docker-activemq"
includeBuild "messaging-backend"
includeBuild "messaging-frontend-client"
includeBuild "messaging-frontend-admin"
includeBuild "docker"

build.gradle

["clean", "assemble", "test", "build"].each { taskName ->
  tasks.create(taskName) { task ->
    gradle.includedBuilds.each { build ->
      if (build.name == "docker") return
      dependsOn gradle.includedBuild(build.name).task(":$taskName")
    }
  }
}

["dockerUp", "dockerDown"].each { taskName ->
  gradle.includedBuilds.findAll({ it.name == "docker" }).each { build ->
    tasks.create(taskName) { task ->
      dependsOn gradle.includedBuild(build.name).task(":$taskName")
    }
  }
}

// bootRun -> run-$projectName
gradle.includedBuilds.each { build ->
  tasks.create("run-$build.name") { task ->
    dependsOn assemble, gradle.includedBuild(build.name).task(":bootRun")
    shouldRunAfter clean, assemble
  }
}

Regards,
Maksim Kostromin

@daggerok Thanks for the suggestion.

I prefer to let the existing Gradle composite build dependency substitution mechanism work as usual, rather than manually adding inter-build task dependencies.

Instead of using Chainsaw, I started using the org.gradle.java.experimental-jigsaw plugin from which Chainsaw was forked.

I modified that original plugin in my own fork, which avoids the problems introduced in Chainsaw.

It’s possible that the original experimental-jigsaw plugin wasn’t optimally set up, or that it doesn’t use newer Gradle API features that would be handy (it was built against Gradle 4.1-rc-1), or that I introduced some new issues in my modifications.

@daggerok & @st_oehme: if either of you (or anybody else) has suggestions about how to improve my fork, I’d be happy for any help.

Also, if you want, I can make more PRs from my fork to the original repo; no one replied to my earlier PRs, so I’m waiting for comments on those before I create any more PRs.

Thanks again.

@rgoldberg Which branch should I be giving feedback on? Also, do you plan to contribute those enhancements back to Chainsaw? It would be a pity if we ended up with two plugins 🙂

@st_oehme Thanks for the help.

I agree that we should try to minimize the number of extant Java module plugins. I think the best solution is to perform PRs from my fork to the original repo, because:

  1. Chainsaw introduced dependency bugs (and possibly others)
  2. Chainsaw completely reorganized the code, renaming the plugin, package & classes, plus adding many additional classes that might not be optimally structured to work properly with Gradle
  3. Over two weeks ago, I opened some issues on Chainsaw, and even provided a link to this thread in one of them, but haven’t heard back from the Chainsaw maintainer, so it might be difficult to merge PRs into Chainsaw, or to get answers to questions about Chainsaw
  4. There’s no need for anyone to use my fork directly, as long as we can integrate PRs from my fork into the original repo in a timely fashion

You should currently look at the tip of the code-cleanup-3 branch of my fork, which is the newest commit in my repo (outside the renamed-plugin branch, which you should ignore). I had to rename the plugin id to deploy the plugin as a binary to the Gradle Plugin Portal; that was necessary to avoid a composite build failure when I included my modified version of org.gradle.java.experimental-jigsaw and used dependency substitution to apply my version to another included build that contains code for a Gradle plugin that is in turn used by other included builds in the same composite.
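
For reference, the substitution involved looks roughly like this (the path and module coordinates below are placeholders):

// settings.gradle of the composite
includeBuild('../experimental-jigsaw-fork') {
    dependencySubstitution {
        substitute module('org.gradle.java:experimental-jigsaw') with project(':')
    }
}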

Should I move all my commits (outside renamed-plugin) to master? Or will some other git reorganization make things easier for you?

Should I add to the README.md a list of features added in my fork (besides minor code cleanup)? Or should I add a CHANGELOG.md (or some other file), or create Issues in my fork and link commits to them, or…? (I normally use hg without PRs, so I don’t know best practices for git and PRs).

So far, my fork’s new features are:

  1. allow compiling a module-info.java that exports a package that has no Java code, but that has code from another language (e.g., Kotlin); this is supported by using --patch-module for the other source set output directories in the same build, instead of including those directories in --module-path (see the sketch after this list)
  2. parse the module name from module-info.java instead of requiring it to be duplicated in javaModule.name in build.gradle(.kts) (only main/java/module-info.java is read; please let me know if I must support other locations)
  3. append javac module arguments to existing arguments instead of replacing existing arguments
  4. only add javac module arguments that include a path if that path is non-empty
  5. general code cleanup, including using Gradle 4.9 Kotlin DSL (instead of 4.1 RC 1 Groovy DSL), updating dependency versions, using lambdas, etc.
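
A sketch of what feature 1 amounts to at the javac level (the module name and Kotlin output directory are hypothetical; the plugin derives both automatically):

// patch the compiled Kotlin classes into the Java module being compiled
compileJava.options.compilerArgs += [
    '--patch-module',
    "com.example.mod=$buildDir/classes/kotlin/main"
]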

Should I also add to the README.md (or some other file) a list of features that I’ve noticed implemented in Chainsaw, that aren’t yet implemented in my fork?

Chainsaw features that I haven’t yet implemented include:

  1. supporting test frameworks other than JUnit 4
  2. optionally verifying module name standards compliance (module name = common package ancestor name)
  3. supporting javadoc
  4. allowing source sets to not have module-info.java
  5. manually configuring various javac module arguments (e.g., --add-reads)

@st_oehme I updated my previous post. Most important changes are:

  1. look at branch code-cleanup-3
  2. the renamed-plugin branch is now a side branch with no descendants, so we should be able to easily perform PRs from my fork to the original
  3. I fixed the last place where I needed to retain existing JVM args
  4. noted that module name is only extracted from main/java/module-info.java; please inform me if I must support other locations
  5. cleaned up some more code in my fork

I wrote the original post below before I realized that Task#getInputs() & Task#getOutputs() seem to provide a simpler version of what I proposed. When @st_oehme mentioned in his previous post that the

patching configuration need to be inputs to the task for it to be correct

I thought that he just used the term to loosely describe JavaCompile#getClasspath(); now that I know about Task#getInputs(), I will investigate its proper usage. I imagine that dependencies, dependsOn, etc. add to the inputs, and that the outputs of each task are defined by the type of task and how it’s configured. I assume that any input I to task T requires that all other tasks whose outputs contain I must be executed before T can be executed.

I imagine that inputs & outputs only specify what must be available for a task to consume, and what the task produces, respectively, but don’t specify how the inputs are consumed, or how the outputs are produced. I imagine that the manner in which an input is consumed is task-specific; so, e.g., AbstractCompile#getClasspath() indicates that an input is consumed as a class-path component.

I imagine that removing a component from a task’s class-path also removes it from the task’s inputs, but I haven’t verified this. (I suspect this because tasks ran out of order when using Chainsaw, and I noticed that Chainsaw clears the class-path, but didn’t notice Chainsaw removing or modifying any inputs or outputs, besides adding one text property input that was also added by jigsaw-experimental, which didn’t suffer from the same task ordering issues.)

Is the above correct?

My proposal below would track the consumption methodology in a standardized manner for all inputs, rather than leaving it ad hoc. It could probably be added in the background using facades that still present the existing input, output, class-path, and other interfaces, but it would have to support unknown consumption methodologies until you were comfortable that all Gradle code and enough plugins were refactored to use the complete new API to allow breaking backwards compatibility.

==============================================================================

From a higher perspective, I imagine that the class-path dependencies are moved to the module-path (or --patch-module) in Task#doFirst() actions because removing the class-path dependencies in the configuration phase would somehow ruin the dependencies, and there’s no existing place to put module-path or patch-module dependencies that preserves the dependency information for the rest of Gradle.

I imagine the nicest way to support modules would be for the initial dependency setup to be able to specify a type for each dependency (class-path, module-path, patch-module, etc.), so that dependencies would be in the appropriate location from the start, instead of having to be transferred from the class-path to the correct location at a later time. That, however, is well outside my knowledge of Gradle. If this wouldn’t require too much refactoring in Gradle, I’d be happy to help revamp Gradle core so that it could support modules this way, then add cleaner module support on top of the refactoring.

I’d imagine that this might be implemented by having an interface named something like Dependency which would have implementations like ClassPathDependency, ModulePathDependency, PatchModuleDependency, etc.

A new dependenciesByType property would be added to AbstractCompile, which would be a Multimap<Class<? extends Dependency>, Dependency>, i.e., it would map from a Dependency sub-type to all of the task’s dependencies of that sub-type. So, e.g., getDependenciesByType().get(ClassPathDependency.class) would return all the class-path dependencies.

JavaCompile#classpath would initially be deprecated and backed by getDependenciesByType().get(ClassPathDependency.class) (or by getDependenciesByType().values(), depending on whether existing code must see all dependencies, or just class-path ones), and would eventually be removed.
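
A rough sketch of that shape (all names hypothetical; Multimap is Guava’s):

import com.google.common.collect.Multimap

interface Dependency {}
class ClassPathDependency implements Dependency {}   // a -classpath entry
class ModulePathDependency implements Dependency {}  // a --module-path entry
class PatchModuleDependency implements Dependency {} // a --patch-module entry

// proposed addition to AbstractCompile:
// Multimap<Class<? extends Dependency>, Dependency> getDependenciesByType()
// getDependenciesByType().get(ClassPathDependency) -> all class-path dependencies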

If anything like this proposed solution were ever implemented, it would likely break some third-party plugins / builds, so it would probably be best to include it in a major release like 5.0 instead of a minor release like 4.10.

@st_oehme:

Added:

  1. JUnit 4, JUnit 5, & TestNG support
  2. javadoc support
  3. no longer fails if the main source set does not have a module-info.java; in this case, it uses the class-path instead of the module-path

Current branch is javadoc-task-support

I am working on supporting modules in more complex situations: custom source sets (i.e., ones other than main & test), multiple code roots (with possibly multiple module-info.java files per task), and more languages (e.g., using -Xmodule-path for KotlinCompile, and supporting Groovy & Scala).

I am still implementing this by retroactively applying the correct task arguments in a Task#doFirst(), then, in the same Task#doFirst(), removing incorrect arguments, like clearing the class-path.
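
In sketch form, that mechanism is roughly the following (simplified; not the plugin’s exact code):

compileJava.doFirst {
    // move the resolved class path onto the module path...
    options.compilerArgs += ['--module-path', classpath.asPath]
    // ...then clear it so javac receives no -classpath argument
    classpath = files()
}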

Retroactive fixes will probably be harder, and possibly impossible, in more obscure corner cases.

It might be best for:

  1. me to get to a decent stopping point that supports modules in more circumstances, but that doesn’t cover all circumstances.
  2. us to do a PR soon to at least share my improvements with all jigsaw-experimental users (we could do one PR for the existing javadoc-task-support branch, and, subsequently, another for my current unfinished improvements)
  3. me to investigate Gradle’s internals
  4. someone to architect a solution to properly configure module dependencies at configuration time, to avoid retroactive fixes at execution time (I can work on this, but any insight from the Gradle experts would speed and improve the design, and would be greatly appreciated)
  5. someone to implement it (I assume I could do this, if the scope of the solution isn’t too broad or too invasive into the Gradle core code)

I won’t have time to go into detail on all these points, but here are some high level thoughts:

I don’t think we should be separating dependencies into module path and class path. We should use one or the other, depending on whether a module-info.java file is present. Otherwise we’ll get a hugely complex solution that only helps in some corner cases, and those corner cases are better served by users writing some logic themselves, I think.

For defining inputs that the task itself doesn’t yet provide (like the module path), have a look at CommandLineArgumentProvider. It allows you to add complex objects as inputs to the task, defining their properties like normalization strategy and path sensitivity and then converting them to command line arguments when needed. This should remove any need for doFirst {} actions.

If you want to get improvements merged back into the jigsaw plugin, I recommend starting small, one feature at a time. We won’t get one giant PR merged in a timely fashion, as it would require lots of discussion on every feature and its implementation. Better to have some improvements sooner than everything much later.

@st_oehme

Thanks for all the help.

I have kept all my changes in serial branches (each branch branches from the previous one, rather than from master), so I can make PRs one at a time (one for each subsequent branch). Once a PR is accepted, I’ll submit the next one. I already submitted some simple PRs a while ago, but no one has responded. Can you put me in touch with someone who can help? I’ll provide a list of major changes so you can see that it’s worth your time to get the new features.

Some features that are on my todo list (in general order of intent to implement):

  1. BLOCKED (awaiting kotlinc -Xbuild-file XML documentation): JPMS support for the Kotlin plugin. This is my current task. I’m only working on KotlinCompile, but I can possibly work on other tasks from the Kotlin plugin, if they exist and if their underlying tools support JPMS. The implementation is waiting on me finding documentation for the XML format of the kotlinc -Xbuild-file argument to fix an issue, so I’ll progress to other tasks while waiting for a link to the docs.
  2. COMPLETED: retooling how compile test tasks are configured on a test platform basis to easily support multiple compile test Tasks.
    1. without modifying Gradle core, we’ll need to update the plugin whenever Gradle supports an additional TestFramework.
    2. with a minor addition to Gradle core, the plugin can be retooled to automatically support all additional TestFrameworks whenever they’re added, without modifying the plugin on a per-TestFramework basis. I’ll discuss the Gradle core changes after my existing changes have been merged into the plugin
  3. CANNOT WORK (awaiting response to my post below): using CommandLineArgumentProvider (please see the note at the end of this post)
  4. providing a DSL to allow build scripts to manually setup various JPMS options
  5. clean up the TaskConfigurer SPI (potentially change TaskConfigurer methods & JigsawPlugin properties; possibly ensure that only one TaskConfigurer can match a Task)
  6. JPMS support in tasks for other languages, e.g., Groovy, Scala, if their tools support JPMS
  7. jlink support
  8. multi-release jar support: I know that variant-aware dependency resolution is better, but, if supporting multi-release jars is simple, and if some people have a valid use case for them, then I can add this feature
  9. optionally verifying module naming standards compliance (module name = common package ancestor name) (this was in the Chainsaw plugin, so, if people want it, I can add it)

CommandLineArgumentProvider note:
One issue is that certain paths on the class path must be relocated to the module path, and others to argument options like --patch-module. I already autodetect certain things that should be moved, but there might be cases where a user must put something in --patch-module that I cannot detect. I assume that, in this case, I should provide a DSL for a user to explicitly add a path to --patch-module; in the background, that will add the item to the task’s classpath, and also add an entry to a SetMultimap that maps from a path to the options (e.g., --patch-module) to which it should be applied. My CommandLineArgumentProvider will move all paths in the SetMultimap from the classpath to the appropriate options (if a path is not in the SetMultimap, my automatic logic will be used instead, with an algorithm that determines where the path should be used).
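
A hypothetical sketch of that bookkeeping (the DSL call and all names are invented; SetMultimap is Guava’s):

import com.google.common.collect.HashMultimap
import com.google.common.collect.SetMultimap

// maps a path to the javac options through which it must be supplied
SetMultimap<File, String> optionsByPath = HashMultimap.create()

// hypothetical DSL effect: the user forces an entry onto --patch-module
def extra = file('libs/extra-classes')
dependencies.add('compileClasspath', files(extra)) // remains a normal input
optionsByPath.put(extra, '--patch-module')

// later, the CommandLineArgumentProvider (or doFirst() action) moves every
// path found in optionsByPath from -classpath to the recorded option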

Existing improvements (I might have missed a few):

  1. supports multiple source directories per source set
  2. supports multiple source sets
  3. configures all tasks of supported types (e.g. JavaCompile) instead of just tasks with specific names (e.g., "compileJava")
  4. has a TaskConfigurer ServiceLoader SPI to support configuring additional Tasks (e.g., KotlinCompile, GroovyCompile, etc.); thus, it can include implementations like KotlinCompileTaskConfigurer that will be used if the Kotlin plugin is also applied to the target build, but that won’t break the build if the Kotlin plugin is not applied
  5. supports JUnit 4, JUnit 5, & TestNG; support can be added for other test frameworks, but I haven’t yet made it a pluggable SPI
  6. supports Javadoc task
  7. supports compiling a module-info.java that exports a package that has no Java code, but that has code from another language (e.g., Kotlin)
  8. parses the module name from module-info.java instead of requiring it to be duplicated in javaModule.name in build.gradle(.kts) (parsing is done only after the task graph has been created, and only if the graph contains a task that must be modified by this plugin; multiple module-info.java files are parsed in parallel using the Worker API; see the sketch after this list)
  9. allows source sets to not have a module-info.java (i.e., doesn’t throw an exception; leaves the class-path as is)
  10. appends javac module arguments to existing arguments instead of replacing existing arguments
  11. only adds javac module options that require a value if the value is non-empty
  12. fixes a potential doFirst() ordering issue
  13. CreateStartScripts configuration correctly replaces all placeholders instead of only the first per line
  14. general code cleanup, including using Gradle 4.9 Kotlin DSL (instead of 4.1 RC 1 Groovy DSL), updating dependency versions, using lambdas, using streams, etc.
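
For illustration, a naive sketch of the module-name extraction mentioned in item 8 (the regex is illustrative only; the real implementation must handle comments and annotations properly):

// extract "module <name>" (or "open module <name>") from module-info.java
def moduleInfo = file('src/main/java/module-info.java')
def matcher = moduleInfo.text =~ /(?m)^\s*(?:open\s+)?module\s+([\w.]+)/
def moduleName = matcher.find() ? matcher.group(1) : null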

@st_oehme

The first problem I see with using CommandLineArgumentProvider instead of doFirst() is that I need to read AbstractCompile#getClasspath() to construct the other command line arguments, but I also need to set the classpath to empty.

JavaCompile#createSpec() reads getClasspath() before JavaCompile#compileOptions.getAllCompilerArgs() is called, so the classpath has already been read before any CommandLineArgumentProvider that I’ve supplied is ever run.

If spec.setCompileOptions(compileOptions); is moved to be the first call after the following line, then I think I can use CommandLineArgumentProvider for JavaCompile:

final DefaultJavaCompileSpec spec = new DefaultJavaCompileSpecFactory(compileOptions).create(); 

The second problem that I see is that not all Tasks use CommandLineArgumentProvider (e.g., CreateStartScripts), so I can’t use that for them. Maybe those Tasks have some other similar mechanism, but I’d need to learn about that, too.

I’ll still probably use existing classpath properties with the SetMultimap that I proposed above for the DSL for manual module options, but, for now, it will be backed by doFirst() instead of CommandLineArgumentProvider, since the former works for all Tasks, and doesn’t require any changes to core Gradle.

Please let me know if I’ve missed something.

You wouldn’t read the classpath property on the task, but SourceSet#compileClasspath. If someone messes with JavaCompile#classpath directly I don’t think they’d expect your plugin to work.

If some task doesn’t provide CommandLineArgumentProvider, you can use doFirst as a workaround, but please feel free to open feature requests.
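
In build-script terms, that suggestion looks something like this (reusing the hypothetical ModulePathProvider sketched earlier in the thread):

// register one provider per source set, reading the source set's
// compileClasspath rather than JavaCompile#classpath
sourceSets.all { ss ->
    def compileTask = tasks[ss.compileJavaTaskName]
    compileTask.options.compilerArgumentProviders.add(new ModulePathProvider(ss.compileClasspath))
}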

@st_oehme

Thanks.

To simplify things, I’ll only mention JavaCompile below, but similar actions must be taken for all Tasks supported by the plugin.

The plugin cannot read SourceSet#compileClasspath because, e.g., the Kotlin plugin adds the KotlinCompile#destinationDir to JavaCompile#classpath in a JavaCompile#doFirst().

Questions:

  1. is there any way to not output a -classpath without clearing JavaCompile#classpath? If so, then for JavaCompile, the plugin can use a CommandLineArgumentProvider without any modifications to Gradle core.

    1. JavaCompilerArgumentsBuilder#includeClasspath(boolean) can’t be used because the JavaCompilerArgumentsBuilder is instantiated and used in a single method, and there’s no way to call includeClasspath(false) on it
  2. if not, then either:

    1. JavaCompile#setGenerateClasspathArgument(boolean) (which defaults to true) could be added to disable generating -classpath without having to clear JavaCompile#classpath

    2. the plugin must clear JavaCompile#classpath, which can be done in:

      1. a JavaCompile#doFirst() (current behavior inherited from original plugin version)

      2. a CommandLineArgumentProvider, but only if JavaCompile#createSpec() is modified so that the call

spec.setCompileOptions(compileOptions);

is moved to be the first call after the following line:

final DefaultJavaCompileSpec spec = new DefaultJavaCompileSpecFactory(compileOptions).create(); 

The plugin cannot clear JavaCompile#classpath earlier than a JavaCompile#doFirst() because of the Kotlin plugin issue mentioned above.

The plugin cannot clear JavaCompile#classpath in a CommandLineArgumentProvider without modifications to Gradle core because JavaCompile#createSpec() currently reads classpath before it calls CommandLineArgumentProvider#asArguments(), so, if the plugin clears classpath in that method, that would clear it only after it’s already been read, so javac will still receive a -classpath.

If Gradle core must be modified, then the plugin cannot work with older versions of Gradle. What is the oldest version of Gradle that supports compiling Java 9 source?

If Gradle core must be modified, adding JavaCompile#setGenerateClasspathArgument(boolean) is nicer than modifying JavaCompile#createSpec().

So, barring new suggestions from you (or from anyone else), and barring modifications to Gradle core, doFirst() is our best option, and it is the same mechanism the original plugin used.

I guess we can live with that hack a little longer. Make sure you still define all the inputs properly, because whatever you do in doFirst {} will not be visible when snapshotting the inputs.

So far, I’m just clearing out JavaCompile#classpath, and using its contents in different tool options without adding any new inputs.

Once I add the DSL for manual Jigsaw configuration, however, I’ll need to handle the inputs correctly.

Should I save a copy of JavaCompile#classpath in the JavaCompile#doFirst {} and then re-set JavaCompile#classpath to the copy in a JavaCompile#doLast {}?

It would probably be safer to do that, because some other Task might read JavaCompile#classpath later.
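
In sketch form (the variable name is illustrative):

def savedClasspath = null
compileJava.doFirst {
    savedClasspath = classpath   // remember the original class path
    classpath = files()          // clear it so javac gets no -classpath
}
compileJava.doLast {
    classpath = savedClasspath   // restore it for anything that reads it later
}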

Right now, I’m cleaning up my existing commits to ease PR reviews. I’ll create a PR when everything is ready for the current state of my code. Then I’ll continue adding more features.
