Applying similar sets of dependencies to different source sets, using a custom plugin

Context

  • I am using Gradle 8.4.
  • The project is a multi-module project, with several modules.
  • The modules' build.gradle files have a lot of structural overlap, and I have gotten them to the point where they are practically identical.
  • I have written custom plugins in Kotlin.
  • The plugins react to the application of other plugins.
  • One plugin in particular creates and configures custom source sets. Right now, the source sets are created via a NamedDomainObjectContainer, and my build.gradle files have the relevant DSL block, like so:
scalaVariants {
	create("2.12") // The plugin uses this to create two source sets, 'scala212' and 'testScala212', and wire up the configurations and tasks.
	create("2.13") // The plugin uses this to create two source sets, 'scala213' and 'testScala213', and wire up the configurations and tasks.
}

This works; however, I am faced with a lot of duplication in my dependencies block. For example, I have this:

scalaVariants {
	create("2.12")
	create("2.13")
}

ext {
	junitVersion = "5.10.1"
	// a bunch of other dependency versions

	sparkVersion = "3.4.2"
	scalaBinaryVersion = project.findProperty("scala.binary.version")
}

dependencies {
	implementation("org.apache.spark:spark-sql_${scalaBinaryVersion}:${sparkVersion}")
	// other regular and scala flavoured dependencies affecting the 'main' source set

	testImplementation("org.junit.jupiter:junit-jupiter:${junitVersion}")
	// other regular and scala flavoured dependencies affecting the 'test' source set

	scala212Implementation("org.apache.spark:spark-sql_2.12:${sparkVersion}")
	// other regular and scala flavoured dependencies affecting the 'scala212' source set

	testScala212Implementation("org.junit.jupiter:junit-jupiter:${junitVersion}")
	// other regular and scala flavoured dependencies affecting the 'testScala212' source set

	scala213Implementation("org.apache.spark:spark-sql_2.13:${sparkVersion}")
	// other regular and scala flavoured dependencies affecting the 'scala213' source set

	testScala213Implementation("org.junit.jupiter:junit-jupiter:${junitVersion}")
	// other regular and scala flavoured dependencies affecting the 'testScala213' source set
}

Now, this pattern occurs across all the modules that have to be compiled against different Scala variants of Apache Spark.

So I did some research, and I came up with an approach I’d like to take. Namely, I want to have a “DSL” that looks like this:

scalaNg {
	defaultScalaBinaryVersion = "2.12"
	supportedScalaBinaryVersions = ["2.12", "2.13"]

	scalaImpl("org.apache.spark", "spark-sql", project.findProperty("spark.version"))
	testImpl("org.junit.jupiter", "junit-jupiter", project.findProperty("junit.version"))
}

The plugin will then configure the dependencies for each source set. scalaImpl means that a "Scala binary version aware" dependency should be created; the inspiration comes from sbt, whose %% operator automatically "injects" the current Scala binary version into the artefact's name. The plugin will place such a dependency in the implementation, scala212Implementation, and scala213Implementation configurations. Similarly, testImpl will do the same for the 'test' source sets.
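At its core, sbt's %% is just name mangling; a minimal, framework-free sketch of the same idea (function names here are illustrative, not part of the plugin):

```kotlin
// Sketch: sbt-style "%%" expansion as plain string mangling.
// Inject the Scala binary version into the artefact name to build the notation.
fun scalaAware(group: String, name: String, version: String, scalaBinaryVersion: String): String =
    "$group:${name}_$scalaBinaryVersion:$version"

// A single scalaImpl(...) declaration then fans out to one notation per supported version:
fun fanOut(group: String, name: String, version: String, targets: List<String>): List<String> =
    targets.map { scalaAware(group, name, version, it) }
```

For example, fanOut("org.apache.spark", "spark-sql", "3.4.2", listOf("2.12", "2.13")) yields the spark-sql_2.12 and spark-sql_2.13 notations.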

The Problem

So I wrote a plugin (as we do), and whilst I get no compilation errors, when I check what configurations and source sets are present, it's as if the plugin was never applied.

Here is my plugin code:

class ScalaVariantPluginNg : Plugin<Project> {
    private fun String.scalaSourceSetName() = "scala${this.replace(".", "")}"
    private fun String.testScalaSourceSetName() = "testScala${this.replace(".", "")}"

    override fun apply(target: Project) {
        target.plugins.withType<JavaPlugin> {
            val ext = target.extensions.create("scalaNg", VariantExtension::class.java)
            val sourceSetContainer = target.extensions.getByType<SourceSetContainer>()
            createSourceSets(sourceSetContainer, ext)
            configureConfigurations(target, ext)
            configureDependencies(target, ext)
        }
    }

    private fun createSourceSets(
        sourceSetContainer: SourceSetContainer,
        variant: VariantExtension
    ) = variant.supportedScalaBinaryVersions.map { scalaBinaryVersions ->
        scalaBinaryVersions.forEach { scalaBinaryVersion ->
            val sourceSetName = scalaBinaryVersion.scalaSourceSetName()
            val testSourceSetName = scalaBinaryVersion.testScalaSourceSetName()

            sourceSetContainer.create(sourceSetName) {
                java.setSrcDirs(listOf("src/main/java", "src/${sourceSetName}/java"))
                resources.setSrcDirs(listOf("src/main/resources", "src/${sourceSetName}/resources"))
            }
            sourceSetContainer.create(testSourceSetName) {
                compileClasspath += sourceSetContainer[sourceSetName].output
                runtimeClasspath += sourceSetContainer[sourceSetName].output
            }
        }
    }

    private fun configureConfigurations(target: Project, ext: VariantExtension) =
        ext.supportedScalaBinaryVersions.map { scalaBinaryVersions ->
            configureConfigurations(target, scalaBinaryVersions)
        }

    private fun configureConfigurations(target: Project, scalaBinaryVersions: List<String>) =
        scalaBinaryVersions.forEach {
            configureConfigurations(target, it)
        }

    private fun configureConfigurations(target: Project, scalaBinaryVersion: String) {
        val configurations = target.configurations
        val sourceSetName = scalaBinaryVersion.scalaSourceSetName()
        val testSourceSetName = scalaBinaryVersion.testScalaSourceSetName()

        val api = configurations.create("${sourceSetName}Api")

        configurations.create("${sourceSetName}ApiElements") {
            extendsFrom(api)
            isCanBeResolved = false
            isCanBeConsumed = true
        }

        val implementation = configurations.named("${sourceSetName}Implementation") {
            extendsFrom(api)
        }

        val runtimeOnly = configurations.named("${sourceSetName}RuntimeOnly")
        val compileOnly = configurations.named("${sourceSetName}CompileOnly")

        configurations.create("${sourceSetName}RuntimeElements") {
            extendsFrom(implementation.get(), runtimeOnly.get())
            isCanBeResolved = false
            isCanBeConsumed = true
        }

        configurations.named("${sourceSetName}CompileClasspath") {
            extendsFrom(compileOnly.get(), implementation.get())
            isCanBeResolved = true
        }

        configurations.named("${sourceSetName}RuntimeClasspath") {
            extendsFrom(implementation.get(), runtimeOnly.get())
            isCanBeResolved = true
        }

        val testImplementation = configurations.named("${testSourceSetName}Implementation") {
            extendsFrom(implementation.get())
        }

        val testRuntimeOnly = configurations.named("${testSourceSetName}RuntimeOnly") {
            extendsFrom(runtimeOnly.get())
        }

        val testCompileOnly = configurations.named("${testSourceSetName}CompileOnly")

        configurations.named("${testSourceSetName}CompileClasspath") {
            extendsFrom(testCompileOnly.get(), testImplementation.get())
            isCanBeResolved = true
        }

        configurations.named("${testSourceSetName}RuntimeClasspath") {
            extendsFrom(testImplementation.get(), testRuntimeOnly.get())
            isCanBeResolved = true
        }
    }

    private fun configureDependencies(target: Project, variant: VariantExtension) {
        val partitionedDependencies = variant.dependencies.map { deps ->
            deps.partition { it.configuration.startsWith("test") }
        }

        partitionedDependencies.map { (testDependencies, mainDependencies) ->
            val defaultScalaBinaryVersion = variant.defaultScalaBinaryVersion.get()
            applyDependenciesToMainSourceSet(target, mainDependencies, defaultScalaBinaryVersion)
            applyDependenciesToTestSourceSet(target, testDependencies, defaultScalaBinaryVersion)

            variant.supportedScalaBinaryVersions.get().map { scalaBinaryVersion ->
                applyDependenciesToMainScalaSourceSet(target, mainDependencies, scalaBinaryVersion)
                applyDependenciesToTestScalaSourceSet(target, testDependencies, scalaBinaryVersion)
            }
        }
    }

    private fun applyDependenciesToMainSourceSet(
        target: Project,
        mainDependencies: List<DependencySpec>,
        scalaBinaryVersion: String
    ) = mainDependencies.forEach { dep ->
        val dependencyNotation = resolveDependencyNotation(dep, scalaBinaryVersion)
        target.dependencies.add(dep.configuration, dependencyNotation)
    }

    private fun applyDependenciesToTestSourceSet(
        target: Project,
        testDependencies: List<DependencySpec>,
        scalaBinaryVersion: String
    ) = testDependencies.forEach { dep ->
        val dependencyNotation = resolveDependencyNotation(dep, scalaBinaryVersion)
        target.dependencies.add(dep.configuration, dependencyNotation)
    }

    private fun applyDependenciesToMainScalaSourceSet(
        target: Project,
        mainDependencies: List<DependencySpec>,
        scalaBinaryVersion: String
    ) {
        val scalaSourceSetName = scalaBinaryVersion.scalaSourceSetName()
        mainDependencies.forEach { dep ->
            val dependencyNotation = resolveDependencyNotation(dep, scalaBinaryVersion)
            target.dependencies.add(
                "${scalaSourceSetName}${dep.configuration.capitalize()}",
                dependencyNotation
            )
        }
    }

    private fun applyDependenciesToTestScalaSourceSet(
        target: Project,
        testDependencies: List<DependencySpec>,
        scalaBinaryVersion: String
    ) {
        val testSourceSetName = scalaBinaryVersion.testScalaSourceSetName()
        testDependencies.forEach { dep ->
            val dependencyNotation = resolveDependencyNotation(dep, scalaBinaryVersion)
            target.dependencies.add(
                "${testSourceSetName}${dep.configuration.capitalize()}",
                dependencyNotation
            )
        }
    }

    private fun resolveDependencyNotation(dep: DependencySpec, scalaBinaryVersion: String): String {
        return when (dep) {
            is DependencySpec.ScalaAwareDependencySpec -> {
                "${dep.group}:${dep.name.replace("{scalaBinaryVersion}", scalaBinaryVersion)}:${dep.version}"
            }

            is DependencySpec.RegularDependencySpec -> {
                "${dep.group}:${dep.name}:${dep.version}"
            }
        }
    }
}

sealed interface DependencySpec {
    val group: String
    val name: String
    val version: String
    val configuration: String

    class RegularDependencySpec(
        override val group: String,
        override val name: String,
        override val version: String,
        override val configuration: String
    ) : DependencySpec

    class ScalaAwareDependencySpec(
        override val group: String,
        override val name: String,
        override val version: String,
        override val configuration: String
    ) : DependencySpec
}


interface VariantExtension {
    val defaultScalaBinaryVersion: Property<String>
    val supportedScalaBinaryVersions: ListProperty<String>
    val dependencies: ListProperty<DependencySpec>

    fun add(dependency: DependencySpec) {
        dependencies.add(dependency)
    }

    fun impl(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "implementation"))
    }

    fun scalaImpl(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "implementation"))
    }

    fun compileOnly(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "compileOnly"))
    }

    fun scalaCompileOnly(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "compileOnly"))
    }

    fun runtimeOnly(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "runtimeOnly"))
    }

    fun scalaRuntimeOnly(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "runtimeOnly"))
    }

    fun api(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "api"))
    }

    fun scalaApi(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "api"))
    }

    fun testImplementation(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "testImplementation"))
    }

    fun scalaTestImplementation(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "testImplementation"))
    }

    fun testCompileOnly(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "testCompileOnly"))
    }

    fun scalaTestCompileOnly(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "testCompileOnly"))
    }

    fun testRuntimeOnly(group: String, module: String, version: String) {
        add(DependencySpec.RegularDependencySpec(group, module, version, "testRuntimeOnly"))
    }

    fun scalaTestRuntimeOnly(group: String, module: String, version: String) {
        add(DependencySpec.ScalaAwareDependencySpec(group, module, version, "testRuntimeOnly"))
    }
}

it's as if the plugin was never applied

Not really, or you would not even have the scalaNg extension.

Your problem is that you are, for example, treating supportedScalaBinaryVersions too lazily.
In createSourceSets you call map on it and get a provider back from that call.
But as long as you never query that provider, the map is never called.
This is not the standard Kotlin List#map, it is the Gradle Provider#map.

You probably want instead something like a DomainObjectSet or other DomainObjectCollection, so that you then can do an .all call on it to also trigger future additions.

It seems I’ve gone from one extreme (being too eager in my earlier iterations of my plugins) to another (being too lazy in this iteration). :joy:

Could you provide a starter or a link to a resource that provides more information on how I could use a DomainObjectSet or DomainObjectCollection to solve this issue?

I mean, in the plugin that came just before this one, I used a NamedDomainObjectContainer - but even then I had some problems with eagerness / laziness, if I tried to set properties on the extension.

Edit: As soon as I posted the response, I thought of something. Are you implying that my extension should have a DomainObjectCollection field on it?

Yes, instead of the ListProperty
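A hedged sketch of what that change could look like (Gradle API types; the names and the trimmed-down extension are illustrative, not a drop-in replacement):

```kotlin
import javax.inject.Inject
import org.gradle.api.DomainObjectSet
import org.gradle.api.model.ObjectFactory

// Sketch: back the extension with a DomainObjectSet instead of a ListProperty,
// so the plugin can react to every element ever added via .all { }.
abstract class VariantExtension @Inject constructor(objects: ObjectFactory) {
    val dependencies: DomainObjectSet<DependencySpec> =
        objects.domainObjectSet(DependencySpec::class.java)

    fun scalaImpl(group: String, module: String, version: String) {
        dependencies.add(
            DependencySpec.ScalaAwareDependencySpec(group, module, version, "implementation")
        )
    }
}
```

In the plugin, `ext.dependencies.all { ... }` then fires once for each existing element and again for every future addition, unlike a Provider#map that is only evaluated if something queries it.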

All right. Following your advice, I managed to get the plugin working as expected! However, I noted one shortcoming, namely:

We can declare dependencies like this in a build.gradle file:

dependencies {
	implementation(project(path: ":shared", configuration: "scala212RuntimeElements"))
	testImplementation(platform("org.junit:junit-bom:5.10.1"))
}

How do I go about actually supporting these Gradle concepts in my own plugin, without coupling to the org.gradle.api.internal.artifacts.dependencies classes, and without reimplementing those things myself?

I’m not sure what you mean.
For what would you use internal classes?

Recall that I’m trying to reduce dependency duplication across my modules. My plugin provides an extension where each module can say, “These dependencies are common to all source sets”. However, I cannot (at this point in time) say, “use this module as a dependency” via my plugin. Concretely, I cannot do this:

scalaNg {
	impl(project(":shared"))
}

Nor can I do this:

scalaNg {
	impl(platform("org.junit", "junit-bom", "5.10.1"))
}

When I poked around the Gradle internals, I saw that it has the concept of Dependency (which is an interface), and a bunch of implementations that are in the internal package.

Essentially, I want to be able to use the Dependency interface (and implementations), in my own plugin, so I can benefit from the methods like project(), platform(), enforcedPlatform() etc.

Sure I would have to define those methods in my own extension, but I am hesitant to use Gradle’s internal classes, i.e., things that sit in the internal package and below.

I think one way would be to require your users to do

scalaNg {
	impl(dependencies.project(":shared"))
}

and having org.gradle.api.artifacts.Dependency / Provider<out org.gradle.api.artifacts.Dependency> as argument.

Another way would probably be to just have something like these in your extension:

abstract class EmpicSettingsExtension {
    @get:Inject
    protected abstract val dependencies: DependencyHandler
    fun platform(notation: Any) = dependencies.platform(notation)
}
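Extending that second idea to the scalaNg extension from this thread, the extension could delegate all such helpers to the injected DependencyHandler (a sketch; the class and method names are mine):

```kotlin
import javax.inject.Inject
import org.gradle.api.artifacts.Dependency
import org.gradle.api.artifacts.dsl.DependencyHandler

// Sketch: expose Gradle's own factory methods from the extension, so users can
// write impl(project(":shared")) without the plugin touching internal classes.
abstract class ScalaNgExtension {
    @get:Inject
    protected abstract val dependencyHandler: DependencyHandler

    fun platform(notation: Any): Dependency = dependencyHandler.platform(notation)

    fun project(path: String): Dependency =
        dependencyHandler.project(mapOf("path" to path))

    fun impl(dependency: Dependency) {
        // record the Dependency so the plugin can later add it to each
        // variant's implementation configuration
    }
}
```

Because everything goes through the public DependencyHandler API, the returned objects are just org.gradle.api.artifacts.Dependency instances; the plugin never names an internal implementation class.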

Not directly related to your response, but I am still trying to develop an intuition around Gradle’s laziness and what things should be lazy versus what things should not be lazy.

The Code

Suppose I have this plugin

class DynamicSourceSetsPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val objects = project.objects
        val builds = objects.domainObjectContainer(Build::class.java) { name ->
            objects.newInstance(
                Build::class.java,
                name
            )
        }
        project.extensions.add("builds", builds)
        createSourceSets(project)
    }

    private fun createSourceSets(project: Project) {
        @Suppress("UNCHECKED_CAST") val builds =
            project.extensions.getByName("builds") as NamedDomainObjectContainer<Build>
        builds.all {
            val sourceSetContainer = project.extensions.getByType(SourceSetContainer::class.java)
            val sparkVersionFmt = this.sparkVersion.get().replace(".", "")
            val scalaBinaryVersionFmt = this.scalaBinaryVersion.get().replace(".", "")
            val sourceSetName = "mainSpark${sparkVersionFmt}Scala${scalaBinaryVersionFmt}"
            val mainSourceSet = sourceSetContainer.create(sourceSetName) {
                val main = sourceSetContainer.getByName("main")
                java.setSrcDirs(main.java.srcDirs)
                resources.setSrcDirs(main.resources.srcDirs)
            }

            val testSourceSetName = "testSpark${sparkVersionFmt}Scala${scalaBinaryVersionFmt}"
            sourceSetContainer.create(testSourceSetName) {
                val test = sourceSetContainer.getByName("test")
                java.setSrcDirs(test.java.srcDirs)
                resources.setSrcDirs(test.resources.srcDirs)
                compileClasspath += mainSourceSet.output
                runtimeClasspath += mainSourceSet.output
            }
        }
    }
}

abstract class Build @javax.inject.Inject constructor(val name: String) {
    abstract val sparkVersion: Property<String>
    abstract val scalaBinaryVersion: Property<String>
}

The Problem

When I apply the plugin, and configure it like so:

builds {
    create("Spark 3.5.0 with Scala 2.13") {
        sparkVersion.set("3.5.0")
        scalaBinaryVersion.set("2.13")
    }

    create("Spark 3.5.0 with Scala 2.12") {
        sparkVersion.set("3.5.0")
        scalaBinaryVersion.set("2.12")
    }
}

I encounter the following error:

Cannot query the value of property ‘sparkVersion’ because it has no value available.

I understand what the message is telling me: I am reading sparkVersion and scalaBinaryVersion in my plugin via Property#get() before their values have been set during the project’s configuration phase. The thing is that I need those values in order to dynamically create my source sets.

Beyond using Project#afterEvaluate, I am not sure how to overcome this issue.

Side note: if I swap builds.all for builds.configureEach, nothing observable changes.

Do you have any suggestions? Any solutions?

It’s not about evaluation or configuration phase, but simply about order.

The difference between all and configureEach is that all causes all elements to be realized and configured immediately, while configureEach just schedules an action to be run when and if the element is actually realized and needs to be configured.

So if, for example, you use configureEach and in the build script use the lazy register instead of the eager create, the action would not be executed unless something else causes the element to be realized.

all and configureEach actions are run immediately when the object is realized / created, before any further configuration happens.
So when doing

create("Spark 3.5.0 with Scala 2.13") {
    sparkVersion.set("3.5.0")
    scalaBinaryVersion.set("2.13")
}

the object is created, then the all and configureEach actions that are already registered are executed, and only then is the configure closure you gave to the create call executed. That is why the properties are not set when you try to get them.
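That ordering can be made visible with a couple of println calls (an illustrative build-script fragment, assuming the Build type from above; the printlns only demonstrate the order):

```kotlin
// Sketch: container actions run when an element is realized, BEFORE the
// configure block passed to create(...) runs.
val builds = project.objects.domainObjectContainer(Build::class.java)

builds.all {
    // Runs at creation time: sparkVersion has not been set yet at this point.
    println("all-action: sparkVersion present = ${sparkVersion.isPresent}") // false
}

builds.create("demo") {
    // Runs only after the all/configureEach actions above have already fired.
    sparkVersion.set("3.5.0")
}
```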

Actually, it also does not really make sense to make these fields Property if you immediately query them anyway and do not wire them to other Property instances. For such direct querying it is just clutter, imho. Well, it might be more future-proof, for example if you also want to wire the values now or later, or if a user wants to wire them.

Using afterEvaluate is not really helpful; its main effect is to introduce timing problems, ordering problems, and race conditions. What, for example, if the user then for whatever reason uses afterEvaluate to configure these values? Then it runs after your action, and things no longer work as expected.

One way to solve this would be to not register the NamedDomainObjectContainer directly as the extension, but to register your own class as the extension and give it a function addBuild that takes name, sparkVersion, and scalaBinaryVersion as parameters. In that function you can then do your source set name calculation and create your custom source sets, as you directly have the values available.

If you still need that named domain object container for other purposes, you could, for example, also create the elements in that addBuild function in your extension and expose the collection as a property of your extension.
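A sketch of that suggestion (the class and function names are mine; this assumes the kotlin-dsl plugin for the receiver-style configure blocks, and that the plugin passes the SourceSetContainer as a constructor argument via extensions.create):

```kotlin
import javax.inject.Inject
import org.gradle.api.tasks.SourceSetContainer

// Sketch: the values arrive as plain parameters, so they are available
// immediately and the source sets can be created on the spot - no laziness.
abstract class DynamicSourceSetsExtension @Inject constructor(
    private val sourceSets: SourceSetContainer // passed via extensions.create(..., sourceSets)
) {
    fun addBuild(name: String, sparkVersion: String, scalaBinaryVersion: String) {
        val suffix = "Spark${sparkVersion.replace(".", "")}Scala${scalaBinaryVersion.replace(".", "")}"
        val main = sourceSets.create("main$suffix")
        sourceSets.create("test$suffix") {
            compileClasspath += main.output
            runtimeClasspath += main.output
        }
    }
}
```

The build script then becomes, e.g., `dynamicSourceSets { addBuild("demo", "3.5.0", "2.13") }`, with no ordering problem to work around.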


Btw. here some boilerplate reduction if you like.
Build can simply be

interface Build : Named {
    val sparkVersion: Property<String>
    val scalaBinaryVersion: Property<String>
}

The factory you set is just what happens by default, so this is sufficient:

val builds = objects.domainObjectContainer(Build::class)

If you give builds as parameter to the createSourceSets function, you don’t need to get and unsafely cast it from project.

Thank you so much.

So basically, the object is created, the actions (the functions / closures) given to all and configureEach are called, and then, finally, the action that sets the properties is called.


Yes, I wanted to avoid afterEvaluate at all costs.


That actually unlocked a world of possibilities for me. I’m not sure why I was under the impression that an Extension object should not contain project configuration logic, and that it should be just a data container (possibly because all the examples in the Gradle manual treat it as such). However, implementing configuration logic on the newly created DynamicSourceSetsPluginExtension and the Build really, really helped.


This didn’t quite work for me. I encountered an error along the lines of getScalaBinaryVersion() cannot be abstract - but it’s very minor and inconsequential.


Further thoughts and ponderings

I wonder if the DomainObjectCollection#whenObjectAdded method could have solved this issue. However, I am not sure if the objects would have arrived in a “fully configured” state (that is, with the properties having been set).

So basically, the object is created, the actions (the functions / closures) given to all and configureEach are called, and then, finally, the action that sets the properties is called.

Exactly

Yes, I wanted to avoid afterEvaluate at all costs.

:ok_hand:

This didn’t quite work for me. I encountered an error along the lines of getScalaBinaryVersion() cannot be abstract - but it’s very minor and inconsequential.

I tested it to be sure before I wrote it.
If you can provide an MCVE where you get that error, maybe I can see where the problem is.
Usually Gradle does add the necessary boilerplate.
Except maybe you are on an oooold Gradle version where the automatic decorations are not yet sophisticated enough.

I wonder if the DomainObjectCollection#whenObjectAdded methods could have solved this issue. However, I am not sure if the objects would have arrived in a “fully configured” state (that is the properties have been set).

I’m not fully sure off the top of my head, but I think you would have the exact same situation.