Is it possible to dynamically change the artifacts assigned to configurations?

Hey guys,

I want to dynamically change the artifacts assigned to configurations. Only if a task has been executed should its artifact be part of the configuration.

I tried something like this,


but the artifact closure will be “executed” in the configuration phase, where there is logically no information yet about which tasks have been executed.
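A minimal sketch of the problem (the task name jarDDL is assumed here, taken from the code later in the thread): the artifacts block is evaluated during the configuration phase, before any task has run, so a check for “was this task executed?” can never succeed there.

```groovy
// Hypothetical sketch: this does NOT work, because the artifacts block
// runs during the configuration phase, before any task has executed.
artifacts {
    if (jarDDL.state.executed) {   // always false at configuration time
        archives jarDDL
    }
}
```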

Can somebody help or has somebody an idea, how to solve this problem?

Thanks in advance!!!

Why do you want to do this? What’s the bigger picture?

In certain circumstances, some jar tasks shouldn’t be executed. Therefore, I have to know which of them to add to the ‘archives’ configuration, because:

If I add every task to the ‘archives’ configuration and execute the uploadArchives task, the ivy.xml in Artifactory will contain the artifacts of all tasks, but it should only contain the artifacts of the jar tasks that were actually executed. If the ivy.xml lists an artifact whose task was skipped, that artifact is never uploaded, and Gradle will throw an exception because it cannot find the artifact in the Artifactory repositories.

I hope this summary makes the situation clearer.

Having the same ivy.xml/configuration refer to different sets of artifacts depending on certain circumstances sounds problematic to me. What are these circumstances? How about publishing multiple configurations with different sets of artifacts? I think it’s better to split things up than to rely on conditional logic here.
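One way to read that suggestion, with hypothetical configuration names: declare a separate configuration per artifact set and publish them independently, so each published configuration only ever refers to artifacts that actually exist.

```groovy
// Hypothetical configuration names; adapt to your project types.
configurations {
    apiArchives
    ddlArchives
}

artifacts {
    // Each configuration carries exactly one well-defined artifact set.
    apiArchives jarAPI
    ddlArchives jarDDL
}
```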

There is a task that assembles .ddl files, for example. But the task should only be executed, and its artifacts added to the ‘archives’ configuration, if the component has .ddl files.

How about only adding the ddl task (and registering its artifacts) for projects that produce .ddl files? You could put this logic into gradle/ddl.gradle, and selected projects could apply it with apply from: "$rootDir/gradle/ddl.gradle".
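A sketch of what such a script plugin file might contain (file name, task name, and source directory are assumptions based on the ‘jarDDL’ naming used elsewhere in this thread):

```groovy
// gradle/ddl.gradle - applied only by projects that actually have .ddl files:
//   apply from: "$rootDir/gradle/ddl.gradle"
task jarDDL(type: Jar) {
    classifier = 'ddl'
    from "${projectDir}/ddl"   // assumed location of the .ddl sources
}

artifacts {
    archives jarDDL
}
```

Because only the projects that apply the script ever declare the task and artifact, no conditional logic is needed at publish time.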

We solved it (I hope) as follows:

tasks.withType(Jar).all { Jar task ->
    afterEvaluate {
        // Only register the artifact if the task's onlyIf condition holds.
        if (task.onlyIf.isSatisfiedBy(task)) {
            if (task.name in ['jarAPI', 'jarDDL', 'jarTest', 'jarSources']) {
                artifacts.add('archives', task)
            }
            if (task.name == 'jarAPI') {
                artifacts.add('compile', task)
            }
            if (task.name == 'jarTest') {
                artifacts.add('testCompile', task)
            }
        }
    }
}

That’s the kind of solution I wanted to save you from. Reliance on onlyIf at least means that you can configure the artifacts early enough for autowiring to kick in. I wonder what kind of logic you have in your onlyIf blocks…

The bigger picture is a “multi-multi-project”. This means we have multiple workareas for developers, and each workarea has multiple projects. Each project has a defined type (client, server, custom, and so on). The type defines the set of artifacts to publish. Because we do not want to define the artifact set for every small project, we define the “apply rules” in a common include script. Our onlyIf conditions generally check whether there are files to process:

task jarSources(type: Jar) {
    classifier = 'src'
    from project.file("${projectDir}/src")
    onlyIf { !fileTree(dir: "${projectDir}/src", include: '**/*.java').isEmpty() }
}

What are the concrete disadvantages of the suggested solution?

If the project type defines the set of artifacts, you could have one plugin per project type and let that plugin declare the necessary tasks and artifacts. Your solution may work, but it is less clean.
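A rough sketch of the per-type plugin idea, with hypothetical plugin and task names (not code from this thread): each project type gets a plugin that declares exactly the tasks and artifacts that type publishes, so no conditional registration is needed.

```groovy
// Hypothetical plugin for one project type ("server"); other types
// would get their own plugins with their own task/artifact sets.
class ServerTypePlugin implements Plugin<Project> {
    void apply(Project project) {
        def jarAPI = project.task('jarAPI', type: Jar) {
            classifier = 'api'
            from project.file("${project.projectDir}/src/api")
        }
        // The artifact is declared unconditionally: if a project is of
        // type "server", it publishes exactly this set.
        project.artifacts.add('archives', jarAPI)
    }
}

// In a server-type project's build script:
// apply plugin: ServerTypePlugin
```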