Project dependency with multiple jars

Ok, first off, I've searched the forum and user guide thoroughly, and I've seen all the suggestions along the lines of "Don't do it like that, refactor it into multiple projects instead", but that is simply not feasible in my setup (yet).

So, I have this clusterfrak of a legacy project, consisting of a zillion modules, grouped into 3 major projects, with circular dependencies and you-name-it; if it smells, we have it!

But I'm trying to nudge this in the right direction, and big bang is not gonna cut it.

So, so far I've managed to take our "core project" and hack and slash that from Ant+Makefiles+Bash-scripts+Velocity+Java-calling-Ant+xsd-over-jasper2… and more, into Gradle.

So far so good. Now I have a nifty core Gradle project, producing 22 different jars/wars/ears. Some of the jars are for the client side, some are for client-side-over-JNLP-needs-signing, etc. etc.

Now I need to get another project (in the same multi-project setup) to depend on some of the artefacts produced in the core project.

How do I do that?
It seems:

dependencies {
  compile project(':core')
}

Doesn’t cut it, and stuff like:

dependencies {
  compile files('../core/build/lib/myawesome.jar')
}

Might get me there, but requires hacking and slashing to get the task-dependencies right.
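To be fair, the file collection can at least be told what builds it via builtBy, so the task wiring comes for free; a sketch, where the jar path and the task path are placeholders for my actual setup:

dependencies {
  // declare which task produces the jar, so Gradle builds it first;
  // both the file path and the task path are placeholders
  compile files('../core/build/lib/myawesome.jar') {
    builtBy ':core:jar'
  }
}

But that still hard-codes output paths and task names across projects, which is exactly the kind of hacking I'd like to avoid.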

So, how do I express a project-dependency on an output-artefact of another project?
And (more importantly) what is required for it to work?

PS: I have really tried to make the core project correctly declare its artefacts (all jars are added to various properties, like sourceSets.main.output and to the archives artefacts).

Possibly something like:

configurations {
   temp
} 
dependencies {
   temp project(':core') 
   compile configurations.temp.filter { it.name == "myawesome-${project.version}.jar" } 
} 

Or like:

configurations {
    // declaring new configuration that will be used to associate with artifacts
    schema
}

task schemaJar(type: Jar) {
    // some imaginary task that creates a jar artifact with some schema
}

// associating the task that produces the artifact with the configuration
artifacts {
    // configuration name and the task:
    schema schemaJar
}

And in the other project

dependencies {
    compile project(path: ':a', configuration: 'schema')
}
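Putting the two halves together, a minimal sketch (the configuration name, include path and appendix are just examples):

// core/build.gradle (producer)
configurations {
    schema
}

task schemaJar(type: Jar) {
    // package only the schema classes from the main output;
    // the include path is an example
    from sourceSets.main.output
    include 'com/core/schema/**'
    appendix = 'schema'
}

artifacts {
    schema schemaJar
}

// other-project/build.gradle (consumer)
dependencies {
    compile project(path: ':core', configuration: 'schema')
}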

Thank you for the answers.

They seem to push me in the right direction.

Finally got a liiiitle further.

Gradle is really giving me gray hairs.

It’s AWESOME! and then it really sucks big time… for a long time… and then the sun gives a ray of sunshine, and the bipolar-disorder-relationship is back being awesome.

I have a lot of code like this in my core project's build.gradle:

task jarExternal(type: Jar) {
    includes = ['com/core/interfaces/**']
}
task jarSetup(type: Jar) {
    includes = ['com/core/setup/**']
}
task jarCertAuthValve(type: Jar) {
    includes = ['com/core/auth/CertificateAuthValve*']
    archiveName = 'certificate-authvalve'
}

And then I tie it together using this:

// Make sure all the jar-tasks above are injected into the dependency hierarchy and use the main output.
tasks.withType(Jar).matching { task -> task.name.startsWith('jar') && task.name != 'jar' }.each { task ->
    task.dependsOn += compileJava
    war.dependsOn += task
    task.from sourceSets.main.output
    if (!task.archiveName || task.archiveName == "${name}.jar") {
        task.archiveName = task.name.substring(3).toLowerCase() + ".jar"
    }
    artifacts.add('jars', task)
    task.description = "gather classes for jar-file $task.archiveName"
}
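(Side note: .each only visits the tasks that exist at that point; Gradle's task collections are live, so .all would also configure matching Jar tasks created later. A sketch; it does not fix the evaluation-order problem described below:)

// .all is "live": the block also runs for matching Jar tasks
// created later in the configuration phase; assumes a 'jars'
// configuration has been declared
tasks.withType(Jar).matching { it.name.startsWith('jar') && it.name != 'jar' }.all { task ->
    task.from sourceSets.main.output
    artifacts.add('jars', task)
}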

Expressing the jars this way seemed nice and nifty.

Except.
Apparently the above code isn't executed before the other projects that try to depend on the core module are evaluated, meaning stuff like:

// other-project.gradle
dependencies {
    compile project(path: ':core', configuration: 'jars')
    println "Jars: " + project(':core').configurations.jars.allArtifacts
}

Yields an empty list, but doing stuff like this:

task helloWorld << {
    println "Jars: " + project(':core').configurations.jars.allArtifacts
}

Shows the expected jars.
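(To inspect this after all projects have been configured, the projectsEvaluated hook works too; just a diagnostic sketch, not a fix:)

// runs once every project has been evaluated, so the dynamically
// registered artifacts are visible here
gradle.projectsEvaluated {
    println "Jars: " + project(':core').configurations.jars.allArtifacts
}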

So, I assume the tasks.withType clause in the core module somehow needs to be executed earlier in the project-evaluation phase, or I somehow need to force evaluation of the core project before the sub-projects.

But how?

This could probably be cleaned up by putting all your config in a list

def jarConfigs = [
   [ taskName: 'jarExternal', includes: [...], archiveName: 'external' ],
   [ taskName: 'jarSetup', includes: [...], archiveName: 'setup' ],
   [ taskName: 'jarCertAuthValve', includes: [...], archiveName: 'certificate-authvalve' ]
]
jarConfigs.each { config ->
   task(config.taskName, type: Jar) {
      includes = config.includes
      archiveName = config.archiveName
      // etc.
   }
}

That is just opinion; will that change the execution flow? Fix my problem?

Correct, that was just a suggestion to cleanup the code and doesn’t fix your problem. You could try a closure to delay the execution.

Eg:

dependencies {
   compile { project(path: ':core', configuration: 'jars') }
} 

Closures on project dependencies do not seem to cut it.

But this helped me (though it's not really beautiful):

dependencies {
  // third party dependencies
}
afterEvaluate {
    dependencies {
      compile project(path: ':core', configuration: 'jars')
    }
}

I cannot say for sure that this approach doesn't have some nasty side-effects.

I'm unsure if this would force the archives to be calculated earlier, but using my list-of-configs approach (see above) you could try:

artifacts {
   jarConfigs.each {
      jars project.tasks.getByName(it.taskName)
   } 
} 

I wouldn't be able to say for sure whether your approach would have some nasty side-effects too.

The problem is using "dynamic" behaviour on tasks/inputs/outputs/artefacts, and then being able to use those during dependency resolution.

By the way, why do you feel that introducing a list of configuration items is better than using the task data model, which is already there for the same purpose?

IMO you just introduce "a new proprietary model" to handle task creation, instead of reusing the Gradle model.

If you want to do the same without a loop it would be:

artifacts {
   jars jarExternal
   jars jarSetup 
   jars jarCertAuthValve 
} 

As for my preference for a list of config objects

  1. You can loop the list multiple times (e.g. once for task declaration, and again for artifact declaration)
  2. It’s DRY
  3. It’s easy to understand the intent from the list
  4. I dislike magic and I feel your archiveName calculation borders on magic. I’d favour an explicit archiveName declaration in a config object.

As you said, it’s personal preference.

You call it magic, I call it convention :slight_smile:

I.e. jarSetup(type: Jar) will generate an archiveName of setup.jar, by convention :slight_smile:

It does seem, however, that it's the

tasks.withType(Jar).matching

that is causing me grief, especially the "registration" on project.artifacts.

Here’s another bit of magic that can be avoided by my list of config approach

tasks.withType(Jar).matching { task -> task.name.startsWith('jar') && task.name != 'jar' }

Oops… I meant convention :wink:

Suit yourself, m8 - you like data structures in your build scripts, I like reusable logic.

We are not going to agree on this point, and it’s drawing focus away from the initial post.

Which is: how can I express inter-project dependencies on artifacts dynamically assigned during the configuration phase, before the execution phase?

There is an incubating feature for ordering tasks, but I haven’t seen a feature for ordering projects.

Have you tried my explicit artifacts { ... } block?

artifacts {
   jars jarExternal
   jars jarSetup 
   jars jarCertAuthValve 
}

If that fails, I promise I'll shut up :smile:. If it works, I think my list-based approach will work too.

I think I've found the fix: add the following line to the consumer's build.gradle:

evaluationDependsOn(':core') 
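
With that in place, the consumer script from earlier should boil down to something like this sketch (names taken from the thread):

// other-project/build.gradle
// force :core to be fully evaluated first, so its 'jars'
// configuration is populated before dependencies are resolved
evaluationDependsOn(':core')

dependencies {
    compile project(path: ':core', configuration: 'jars')
}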

You are my hero - I think that was what I was looking for…

Now, on to the next head-ache… :smiley: