Incremental build for aggregated tasks

Hello,

I have run into a situation where I have the following task dependency:
taskA.dependsOn taskB

I have defined incremental constraints for both tasks, which work perfectly.
Now I would like to modify the incremental constraints of taskB so that taskB is also executed when taskA is not up to date. Is that possible?

Thank you and kind regards
Markus

Hi Markus,

Please tell us more about the problem you are trying to solve. You are basically asking how to create a cyclic task dependency, which I’m sure is not what you really want :slightly_smiling:

Cheers,
Stefan

Hi Stefan,

Indeed, I am not interested in creating a cyclic task dependency. :wink:

Just as an example, I would like to do the following:
As part of our acceptance tests we set up the whole application and run the acceptance tests. The setup involves two tasks: dropDatabase and setupDatabase. The task dropDatabase checks whether a matching database schema exists (as an output of a previous run, for example) and, if there is one, drops it. Afterwards the task setupDatabase creates a new database schema and executes the same scripts that set up the database for a production environment.

Therefore our build currently defines “setupDatabase.dependsOn dropDatabase”, including incremental constraints. Let’s assume the first build run executes dropDatabase successfully, but setupDatabase fails. The result is that the schema exists only partially. The subsequent incremental build then considers dropDatabase up to date and setupDatabase out of date. I would like to execute dropDatabase if and only if setupDatabase is not up to date. Is this possible?

Markus

Sounds like the up-to-date check for dropDatabase is not sophisticated enough yet. What are you checking for currently?

My first approach was to write the tasks as in the example below. Both tasks write their output into the same directory, and this directory is also declared as the output for the incremental build. In the meantime I noticed that the outputs of dropDatabase are not affected when a subsequent task adds files to the same directory.

task dropDatabase(type: Exec, dependsOn: extractDbSchema) {
    inputs.files tasks.getByPath('extractDbSchema').outputs.files
    def outputDir = file("${buildDir}/reports/dbsetup")
    outputs.dir "${outputDir}"

    // a non-zero exit value is tolerated here and handled in doLast
    ignoreExitValue = true

    workingDir "${buildDir}/dbschema/database_description/scripts/"
    commandLine 'sh', 'deleteAll.sh' // the script checks whether the schema already exists and drops it if so

    standardOutput = new ByteArrayOutputStream()
    errorOutput = new ByteArrayOutputStream()

    doLast {
        outputDir.mkdirs()
        file("${outputDir}/dbdrop.log") << standardOutput
        file("${outputDir}/dbdrop.err.log") << errorOutput
        if (execResult.exitValue != 0) {
            throw new GradleException("See the log output located in ${outputDir} for further information!")
        }
    }
}

task setupDatabase(type: Exec, dependsOn: dropDatabase) {
    inputs.files tasks.getByPath('dropDatabase').inputs.files
    def outputDir = file("${buildDir}/reports/dbsetup")
    outputs.dir "${outputDir}"

    // a non-zero exit value is tolerated here and handled separately in doLast
    ignoreExitValue = true

    workingDir "${buildDir}/dbschema/database_description/scripts/"
    commandLine 'sh', 'createAll.sh' // creates a new database schema

    standardOutput = new ByteArrayOutputStream()
    errorOutput = new ByteArrayOutputStream()

    doLast {
        outputDir.mkdirs()
        file("${outputDir}/dbsetup.log") << standardOutput
        file("${outputDir}/dbsetup.err.log") << errorOutput
        if (execResult.exitValue != 0) {
            throw new GradleException("See the log output located in ${outputDir} for further information!")
        }
    }
}

I also tried to define the outputs of dropDatabase with

outputs.upToDateWhen { tasks.getByPath('setupDatabase').state.getUpToDate() }

and with

onlyIf { return !tasks.getByPath('setupDatabase').state.getUpToDate() }

Neither worked.

That can’t work, since dropDatabase runs first.

In short, you want both tasks to either fail or succeed together. I think the simplest solution for this case would be to have one task to do both things, something like updateDatabase.
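A minimal sketch of that combined task could look like the following. The script names are taken from the snippets earlier in this thread; the chained `sh -c` invocation is an assumption about how the two scripts can be composed and would need to match your environment:

```groovy
// Hypothetical combined task: drop and recreate the schema as one unit of work,
// so an incremental build never observes a half-finished state between the steps.
task updateDatabase(type: Exec) {
    workingDir "${buildDir}/dbschema/database_description/scripts/"
    // deleteAll.sh already checks whether the schema exists, so chaining it
    // with createAll.sh gives "drop if present, then create" in one task.
    commandLine 'sh', '-c', './deleteAll.sh && ./createAll.sh'
    outputs.dir "${buildDir}/reports/dbsetup"
}
```

With a single task, either both steps succeed and the task is up to date next time, or the task fails and reruns as a whole.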

Another possible solution would be to have a “lastFailure” file which is an input for dropDb and an output of setupDb. setupDb would only write this file if it fails, which would then make sure dropDb is considered out of date again the next time.
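Assuming Gradle’s regular input/output tracking, the “lastFailure” idea could be sketched like this. The marker file location is illustrative, and the task bodies are abbreviated versions of the snippets above:

```groovy
// Hypothetical marker file recording the last setupDatabase failure.
def lastFailureFile = file("${buildDir}/dbsetup.lastFailure")

task dropDatabase(type: Exec) {
    // If setupDatabase wrote the marker on the previous run, this input has
    // changed, so dropDatabase is considered out of date and runs again.
    inputs.files lastFailureFile
    workingDir "${buildDir}/dbschema/database_description/scripts/"
    commandLine 'sh', 'deleteAll.sh'
    outputs.dir "${buildDir}/reports/dbsetup"
}

task setupDatabase(type: Exec, dependsOn: dropDatabase) {
    outputs.file lastFailureFile
    ignoreExitValue = true
    workingDir "${buildDir}/dbschema/database_description/scripts/"
    commandLine 'sh', 'createAll.sh'

    doLast {
        if (execResult.exitValue != 0) {
            // Record the failure so that dropDatabase reruns next time.
            lastFailureFile.text = "failed at ${new Date()}"
            throw new GradleException("setupDatabase failed")
        }
        // On success, remove any stale marker from an earlier failed run.
        lastFailureFile.delete()
    }
}
```

The key point is that the marker is an input of dropDatabase but an output of setupDatabase, which couples the two tasks’ up-to-date states in exactly the direction the dependency graph cannot express.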

Hi Stefan,

thanks a lot for your help. The idea with the “lastFailure” file is working very well.

Regards
Markus