Transitive version constraints don't work as promised

Below is my build.gradle
I’m using Spark 3.1.1 for Scala 2.12 and expect to see only scala-library 2.12.10 in my dependency tree.
In fact there’s still 2.12.15, inherited from delta-core. Why?
What should I do to restrict my build to a single version of a dependency?


plugins {
    // required to read ~/.m2/settings.xml with azure credentials
    id "net.linguica.maven-settings" version "0.5"
}

// setting up custom repos
apply from: "gradle/repositories.gradle"
apply from: "gradle/methods.gradle"

allprojects {
    apply plugin: "scala"
    apply plugin: "net.linguica.maven-settings"
    apply plugin: "java-library"

    task pack(type: Sync, dependsOn: "jar") {
        from configurations.default
        into "$buildDir/dependencies"
    }

    //define "provided" scope
    configurations { provided }
    sourceSets {
        main.compileClasspath += configurations.provided
        test.compileClasspath += configurations.provided
        test.runtimeClasspath += configurations.provided
    }
        configure(project) {
            // spark is provided with the container, so we can safely add it to all projects
            dependencies {
                provided(spark.core)
                provided(spark.sql)
                provided(spark.streaming)
                provided(spark.sql.kafka)
                provided(spark.avro)
                "io.delta:delta-core_$scalaMajorVersion:1.0.0")
                constraints {
                    implementation("org.scala-lang:scala-library") {
                        version strictly: "2.12.10"
                        because "it's provided with Spark"
                    }
                    implementation("org.scala-lang:scala-reflect") {
                        version strictly: "2.12.10"
                        because "it's provided with Spark"
                    }
                }
            }
        }

Besides the fact that this snippet is sprinkled with all sorts of bad practices, it will not even compile.
Please at least provide the actual content, or any statements are just guesswork.
Optimal, as always, would of course be an MCVE.

yeah, sorry.
Here’s a project that would compile.
And what are the bad practices here? I’m new to Gradle, as you could guess :slight_smile:

plugins {
    // required to read ~/.m2/settings.xml with azure credentials
    id "net.linguica.maven-settings" version "0.5"
//    id "com.linecorp.build-recipe-plugin" version "1.1.1" apply false
}
apply plugin: "scala"
apply plugin: "net.linguica.maven-settings"
apply plugin: "java-library"

repositories {
    mavenCentral()
    maven {
        url "http://packages.confluent.io/maven/"
        allowInsecureProtocol true
    }
}

task pack(type: Sync, dependsOn: "jar") {
    from configurations.default
    into "$buildDir/dependencies"
}

//define "provided" scope
configurations { provided }
sourceSets {
    main.compileClasspath += configurations.provided
    test.compileClasspath += configurations.provided
    test.runtimeClasspath += configurations.provided
}

dependencies {
    provided("org.apache.spark:spark-core_2.12:3.1.1")
    provided("org.scalactic:scalactic_2.12:3.2.10")
    constraints {
        implementation("org.scala-lang:scala-library") {
            version strictly: "2.12.10"
            because "it's provided with Spark"
        }
        implementation("org.scala-lang:scala-reflect") {
            version strictly: "2.12.10"
            because "it's provided with Spark"
        }
    }
}

Here’s a project that would compile.

And the problem with it is what?
You originally said something about delta-core, but there is no delta-core anymore.
If you mean why you get 2.12.15 in provided, well, why not?
Your provided has no relationship to implementation, so there is no conflict resolution or similar to be done.
You declare org.scalactic:scalactic_2.12:3.2.10 on provided, which transitively depends on scala-library 2.12.15.
The constraints you declare are for implementation and the configurations inheriting from it, but have nothing to do with provided.
And you add provided as-is, after resolution, to your classpaths.
So 2.12.15 naturally makes it there.
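If the goal is for the constraints to also affect how provided resolves, one option (just a sketch, assuming the custom provided configuration from the build above) is to declare the constraints for that configuration as well:

```groovy
dependencies {
    constraints {
        // constraints are declared per configuration; "provided" here is
        // the custom configuration defined in the build above
        provided("org.scala-lang:scala-library") {
            version { strictly "2.12.10" }
            because "it's provided with Spark"
        }
        provided("org.scala-lang:scala-reflect") {
            version { strictly "2.12.10" }
            because "it's provided with Spark"
        }
    }
}
```

Whether you want to duplicate the constraints like this, or instead rework the configuration setup so provided is not needed at all, is a design decision.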

And what are the bad practices here?

Well, as you asked for it, here are some, probably not all, don’t complain :smiley: :

  • Totally subjective personal one: do not use the Groovy DSL, but the Kotlin DSL. You get type-safe build scripts, much better error messages if you mess up, amazingly better IDE support with proper auto-completion, …
  • Do not use script plugins (apply from:), but better use precompiled script plugins, especially if you follow the Kotlin DSL advice. When using the Groovy DSL it is somewhat ok to keep them.
  • Do not use allprojects { ... } or subprojects { ... }; they introduce project coupling and thus disturb more sophisticated Gradle features like parallel execution or the configuration cache, and they also make builds harder to understand and harder to maintain. Better use convention plugins, for example in the form of precompiled script plugins, that you then apply directly where you want to have their effect.
  • Do not use the legacy apply to apply plugins, but the plugins { ... } block.
  • Do not use explicit dependsOn except for lifecycle tasks. Instead you should properly wire task outputs to task inputs and with that get the necessary task dependencies automatically.
  • The default configuration is deprecated.
  • Do not use task foo(...); it works against task configuration avoidance and thus wastes your time when running builds where the task is not executed. Use for example tasks.register instead.
  • Do not create a custom half-wired provided configuration, but simply use compileOnly for those dependencies.
  • If you declare a custom configuration, make it non-resolvable, non-consumable, or both, depending on the intended usage of the configuration; that both are true by default is legacy and just there for backwards compatibility.
  • Not sure what the configure(project) is for; gut feeling says it is superfluous.
  • spark.core and so on look like you have defined some ext variables in one of the script plugins; instead use version catalogs for central declaration of dependencies and versions.
  • Actually, almost any usage of ext is a code smell, and most probably a better solution exists.
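For what it is worth, a few of those points applied to the MCVE could look roughly like this (a sketch, assuming the same coordinates as above; compileOnly stands in for the custom provided configuration):

```groovy
plugins {
    id "scala"
    id "java-library"
    id "net.linguica.maven-settings" version "0.5"
}

dependencies {
    // compileOnly replaces the hand-rolled "provided" configuration
    compileOnly("org.apache.spark:spark-core_2.12:3.1.1")
}

// register instead of the eager task(...) syntax; wiring from the jar
// task's output gives the task dependency automatically, so no
// explicit dependsOn is needed, and runtimeClasspath replaces the
// deprecated "default" configuration
tasks.register("pack", Sync) {
    from(configurations.runtimeClasspath)
    from(tasks.named("jar"))
    into(layout.buildDirectory.dir("dependencies"))
}
```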

Found the problem btw. It should have been version { strictly "2.12.10" } instead of version strictly: "2.12.10"
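For reference, the MCVE’s constraints block with that fix applied:

```groovy
dependencies {
    constraints {
        implementation("org.scala-lang:scala-library") {
            // block syntax, not a named argument
            version { strictly "2.12.10" }
            because "it's provided with Spark"
        }
        implementation("org.scala-lang:scala-reflect") {
            version { strictly "2.12.10" }
            because "it's provided with Spark"
        }
    }
}
```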


As for bad practices, thanks.

It’s just that in some cases Gradle’s internal decisions seem unfathomable and overcomplicated. For instance, compileOnly dependencies are invisible in test code, so I have to repeat the same dependency again with testCompileOnly OR create a half-wired “provided” that actually does the job that Maven and sbt can do out of the box.

Well, both have pros and cons.

If you have a compileOnly dependency, it might be a good idea that your tests do not see it, so you verify that everything works without it on the class path.
And maybe you should instead have another test task that includes that dependency, to test how the code behaves with it, or to run tests for code depending on that dependency, and so on.
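Such an extra test task could be sketched roughly like this (the configuration name compileOnlyResolvable is made up for the example):

```groovy
// resolvable view of the compileOnly dependencies; compileOnly
// itself is just a bucket for declaring dependencies
def compileOnlyResolvable = configurations.create("compileOnlyResolvable") {
    extendsFrom(configurations.compileOnly)
    canBeConsumed = false
    canBeResolved = true
}

// run the same tests, but with the compileOnly dependencies added
// back to the runtime classpath
tasks.register("testWithCompileOnlyDeps", Test) {
    testClassesDirs = sourceSets.test.output.classesDirs
    classpath = sourceSets.test.runtimeClasspath + compileOnlyResolvable
}
```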

Most often compileOnly dependencies are just annotations that are not necessary at runtime, but that a consumer can use if they have them on the class path.

Also, if that is what you want, I think you can easily do configurations.testCompileOnly.extendsFrom(configurations.compileOnly), which would take care of all dependencies declared on compileOnly also being available on testCompileOnly.

Actually, the java-library plugin does that for compileOnlyApi (configurations.testCompileOnly.extendsFrom(configurations.compileOnlyApi)), but if you declare your dependency on compileOnlyApi it will also be available on the compile classpath of downstream projects, which might not be what you want.
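Spelled out, the extendsFrom wiring for compileOnly is a one-liner in the build script:

```groovy
// make dependencies declared on compileOnly also visible when
// compiling tests, mirroring what java-library already does
// for compileOnlyApi
configurations.testCompileOnly.extendsFrom(configurations.compileOnly)
```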