Java-library plugin: why is the 'api' dependency handler unavailable for a custom sourceSet?

I am trying to use the java-library plugin in a DB driver project that acts as a stub/DAO for the actual DB implementation.

My project doesn’t need an actual DB implementation like SQLite or Postgres itself, but it does require certain implementation libraries to be present on the consumer’s runtimeClasspath, so that consumers need not declare those dependencies explicitly.

I am aware that a library need not export its implementation libraries as transitive dependencies, but I am doing so here to test how transitive dependencies are passed on to the consumer.

Problem: the api dependency handler is unavailable for a custom sourceSet.

build.gradle

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

sourceSets{
    driver
}

dependencies {
    driverImplementation 'org.postgresql:postgresql:42.6.0'
    driverImplementation 'org.xerial:sqlite-jdbc:3.41.2.0'
    api 'org.postgresql:postgresql:42.6.0'
    api 'org.xerial:sqlite-jdbc:3.41.2.0'
}

jar{
    from sourceSets.driver.output
}

What I am expecting:

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

sourceSets{
    driver
}

dependencies {
    driverImplementation 'org.postgresql:postgresql:42.6.0'
    driverImplementation 'org.xerial:sqlite-jdbc:3.41.2.0'
    driverApi 'org.postgresql:postgresql:42.6.0'
    driverApi 'org.xerial:sqlite-jdbc:3.41.2.0'
}

jar{
    from sourceSets.driver.output
}

Reason for the question: I expect a custom sourceSet and its configurations/dependencies to be completely isolated from the main sourceSet and its configurations/dependencies.

Any clarification on why a custom sourceSet’s api dependency handler (e.g. driverApi) is unavailable would be helpful.

Well, it is not isolated in your use-case, as you pack the output of the driver source set into the artifact produced by the main source set, and are thus wedding them together.

Your example will not really work properly either way.
With your current setup (ignoring the api attempts), a consumer that depends on the main artifact will miss the driver source set’s dependencies on its runtime classpath, as you only have them as implementation dependencies of the driver source set, not of the main source set.
So having a driverApi configuration would be just as pointless.
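For reference, one way those libraries could reach the consumer with this single-jar layout is to also declare them on the main source set — a sketch, not a recommendation, reusing the coordinates from above:

```groovy
dependencies {
    // duplicated on the main source set purely so they end up on
    // runtimeElements and thus on the consumer's runtime classpath
    implementation 'org.postgresql:postgresql:42.6.0'
    implementation 'org.xerial:sqlite-jdbc:3.41.2.0'
}
```

This of course defeats the intended isolation, which is exactly the point being made here.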

I don’t think you are taking the right approach to whatever you actually want to achieve to be honest.

Thanks for responding quickly,

I do understand that the jar task packages the output of the driver sourceSet and makes it available to the consumer via implementation project(":DatabaseDriver"), and so doesn’t isolate it from main as I am trying to do.

I can create a custom configuration and associate an artifact with it, so that the consumer can consume from that configuration.

DatabaseDriver → build.gradle

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

configurations {
    driverApiJar
}

sourceSets{
    driver
}

dependencies {
    driverImplementation 'org.postgresql:postgresql:42.6.0'
    driverImplementation 'org.xerial:sqlite-jdbc:3.41.2.0'
    driverApiJar 'org.postgresql:postgresql:42.6.0'
    driverApiJar 'org.xerial:sqlite-jdbc:3.41.2.0'
}

jar{
    enabled = false
}

task driverJar(type: Jar){
    from sourceSets.driver.output
}

artifacts {
    driverApiJar driverJar
}

Consumer → build.gradle

plugins {
    id 'java'                   //used to support java features
    id 'application'            //used to create java CLI application
}

dependencies {
    implementation project(path: ":DatabaseDriver", configuration: "driverApiJar")
}

application {
    mainClass = 'org.example.consumer.Main'
}

sourceSets{ main }

I can rename the custom configuration to driverApi instead of driverApiJar, and that almost resolves my issue, since the dependencies are available on the custom configuration’s classpath.

Thus, when a consumer requests that specific configuration, they get the dependencies implicitly.

As I mentioned, I just wanted to know why it was done this way, i.e. why only the ‘api’ dependency handler (for the main sourceSet) is available with the java-library plugin.

If it were available, I could get away without creating a custom configuration and associating an artifact with it, etc., as shown below.

DatabaseDriver → build.gradle

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

sourceSets{
    driver
}

dependencies {
    driverImplementation 'org.postgresql:postgresql:42.6.0'
    driverImplementation 'org.xerial:sqlite-jdbc:3.41.2.0'
    driverApi 'org.postgresql:postgresql:42.6.0'
    driverApi 'org.xerial:sqlite-jdbc:3.41.2.0'
}

jar{
    from sourceSets.driver.output
}

Consumer → build.gradle

plugins {
    id 'java'                   //used to support java features
    id 'application'            //used to create java CLI application
}

dependencies {
    implementation project(":DatabaseDriver")
}

application {
    mainClass = 'org.example.consumer.Main'
}

sourceSets{ main }

How this helps: the consumer need not know about any custom configurations and can rely on the default configuration and the default artifact from the jar task.

I can merge all the outputs in the jar task so that the consumer just needs a single jar, while I can still have multiple sourceSets in my library build.

If I fall back to the available “api” dependency handler, it still works, but it doesn’t distinguish which transitive api dependency belongs to which sourceSet (the association becomes unclear).

What works now:

DatabaseDriver → build.gradle

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

sourceSets{
    xxxx
    yyyy
    zzzz
    ...
}

dependencies {
    // xxxx sourceSet internally consumed implementation and externally exported api dependency 
    xxxxImplementation 'aaaa.bbbb.xxxx:1.0'
    xxxxImplementation 'aaaa.bbbb.yyyy:1.0'
    api 'aaaa.bbbb.xxxx:1.0'
    api 'aaaa.bbbb.yyyy:1.0'

    // yyyy sourceSet internally consumed implementation and externally exported api dependency 
    yyyyImplementation 'cccc.bbbb.xxxx:1.0'
    yyyyImplementation 'cccc.bbbb.yyyy:1.0'
    api 'cccc.bbbb.xxxx:1.0'
    api 'cccc.bbbb.yyyy:1.0'

    // zzzz sourceSet internally consumed implementation and externally exported api dependency 
    zzzzImplementation 'dddd.bbbb.xxxx:1.0'
    zzzzImplementation 'dddd.bbbb.yyyy:1.0'
    api 'dddd.bbbb.xxxx:1.0'
    api 'dddd.bbbb.yyyy:1.0'

   ...
}

jar{
    from sourceSets.xxxx.output
    from sourceSets.yyyy.output
    from sourceSets.zzzz.output
    ...
}

What I am requesting:

DatabaseDriver → build.gradle

plugins {
    id 'java-library'
}

repositories {
    mavenCentral()
}

sourceSets{
    xxxx
    yyyy
    zzzz
    ...
}

dependencies {
    // xxxx sourceSet internally consumed implementation and externally exported api dependency 
    xxxxImplementation 'aaaa.bbbb.xxxx:1.0'
    xxxxImplementation 'aaaa.bbbb.yyyy:1.0'
    xxxxApi 'aaaa.bbbb.xxxx:1.0'
    xxxxApi 'aaaa.bbbb.yyyy:1.0'

    // yyyy sourceSet internally consumed implementation and externally exported api dependency 
    yyyyImplementation 'cccc.bbbb.xxxx:1.0'
    yyyyImplementation 'cccc.bbbb.yyyy:1.0'
    yyyyApi 'cccc.bbbb.xxxx:1.0'
    yyyyApi 'cccc.bbbb.yyyy:1.0'

    // zzzz sourceSet internally consumed implementation and externally exported api dependency 
    zzzzImplementation 'dddd.bbbb.xxxx:1.0'
    zzzzImplementation 'dddd.bbbb.yyyy:1.0'
    zzzzApi 'dddd.bbbb.xxxx:1.0'
    zzzzApi 'dddd.bbbb.yyyy:1.0'

   ...
}

jar{
    from sourceSets.xxxx.output
    from sourceSets.yyyy.output
    from sourceSets.zzzz.output
    ...
}

In both cases the consumer should simply depend on the project alone, i.e. just the output of the jar task.

Consumer → build.gradle

plugins {
    id 'java'                   //used to support java features
    id 'application'            //used to create java CLI application
}

dependencies {
    implementation project(":DatabaseDriver")
}

application {
    mainClass = 'org.example.consumer.Main'
}

sourceSets{ main }

NOTE: the question is still about the “api” dependency handler that Gradle does not provide for a custom sourceSet.

If it were available, I could get away without creating a custom configuration and associating an artifact with it, etc., as shown below.

Again, no, you wouldn’t, as you do not consume driverApi or driverImplementation with that approach, because you just package the compiled files into the normal jar. So you would anyway have to declare the dependencies on api when using such a dirty setup, which does not really make much sense. And as it does not make sense, it is also not provided by the java-library plugin.

What you actually might be after are feature variants. There you also have the proper ...Api and ...Implementation configurations and consume them properly in downstream projects: Modeling feature variants and optional dependencies
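A minimal sketch of what feature variants could look like here (untested; the capability coordinates depend on your actual group/name, shown with hypothetical values):

```groovy
// DatabaseDriver → build.gradle (sketch, assumes sourceSets { driver } as above)
java {
    registerFeature('driver') {
        usingSourceSet(sourceSets.driver)
    }
}

dependencies {
    // these configurations are wired up by registerFeature
    driverApi 'org.postgresql:postgresql:42.6.0'
    driverImplementation 'org.xerial:sqlite-jdbc:3.41.2.0'
}
```

The consumer then selects the variant via its capability:

```groovy
// Consumer → build.gradle (sketch)
dependencies {
    implementation(project(':DatabaseDriver')) {
        capabilities {
            // default capability coordinates: <group>:<name>-<feature>
            requireCapability('org.example:DatabaseDriver-driver')
        }
    }
}
```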

How this helps: the consumer need not know about any custom configurations and can rely on the default configuration and the default artifact from the jar task.

Then just throw your classes into the main source set.
There is not much point in separating them into a separate source set if you then want to consume them together.

Otherwise I guess - but didn’t try it - you could probably achieve what you want by using

configurations {
    runtimeElements.extendsFrom driverImplementation
}

in your producer project.

NOTE: the question is still about the “api” dependency handler that Gradle does not provide for a custom sourceSet.

Yeah, well, the answer is that it makes no sense to provide it, as it wouldn’t be wired to anything.
Source sets are not meant to be merged like that.
And if you do merge them, you also have to set up the other parts accordingly, but that should be as simple as the extendsFrom I showed, I think.
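For completeness, a sketch of those adjustments on top of the single-jar setup (untested; the apiElements line is only an assumption for the case where consumers should also compile against the driver libraries):

```groovy
configurations {
    // forward the driver source set's dependencies to consumers at runtime
    runtimeElements.extendsFrom driverImplementation
    // only if consumers should see the driver libraries at compile time too
    apiElements.extendsFrom driverImplementation
}
```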

Hey @Vampire,

Thanks for responding to the “why so” part that it wouldn’t be useful and/or that other gradle features such as “feature variants and optional dependencies” can cover what I am requesting for.

Btw, I have an equivalent domain project that we are adapting, and it is much more complex (it was using an Ant build). We do not want to organize the source code directories at this point until release and that is the reason we are trying to go with this route. The build.gradle couldn’t be shared for NDA reasons.

Thanks again for your prompt response. I will check with the team about adapting the isolation request using “feature variants”.

Just FYI, the interlinked/composite Ant build didn’t have this issue, since everything is explicit and the producer’s artifacts/sources/classpaths can be plugged directly into the required places in the consumer’s targets.

we do not want to organize the source code directories at this point until release and that is the reason we are trying to go with this route

Speaking as someone who’s just reaching the end of that particular road on my own project, the approach I took to this was to create a parallel directory structure containing the new Gradle build, and create a script that symlinks source files from the old Ant build into the new location. That way, the people working on the new build can do so without impacting those working on the application code… you just need to re-run the script every time you merge incoming code.

I’m now a few days away from the second part of that, which is running the modified version of the script which moves twelve thousand source files to their permanent new home. That’s going to be a real pain to commit — but I’ve been able to spend months perfecting the new build, getting the Gradle build files reviewed and merged and running on testbeds, so the final step should be relatively painless.


Thanks for the idea. I think that would keep things much more isolated and smooth the migration to the Gradle build (with its recommended design) without having to worry about retaining the old structure in the initial phase.

Please do post how the final transition goes.

The code will be merged in a few days, but I’m not anticipating any serious problems at this point… the big merge is going to be a headache, but that’s mostly around coordinating with all the other developers to ensure that the codebase is stable and passing tests before I run my final conversion.

Changing the script from “symlink” to “git move” turned up a few minor conversion bugs — mostly involving cases where I’d been unintentionally relying on several duplicate symlinks being created, something which fails when the original source file is moved. But again, that’s something that can easily be tested in a local dev environment… it’s a bit slower than the symlink version, but it’s still easy enough to run the conversion again and again until I’m confident it’s acceptably close to the old Ant build.

Note that I have been merging changes as much as possible along the way… e.g. if moving a Java class to a new package made it easier to classify correctly (this was especially the case for tests vs test fixtures), that stuff all gets reviewed and merged as soon as possible. The final merge will be massive, but critically, it’s all automated bulk moves — nothing needs to be reviewed, because not a single line of code is changing. The bits that need review — the code changes, the new build scripts — are already reviewed and merged.


Understood, thanks for the update. I guess I am too late in my journey to try that out, but I will keep it in mind when approaching such projects in the future.