Applying similar sets of dependencies to different source sets, using a custom plugin

It’s not about the evaluation or configuration phase, but simply about ordering.

The difference between all and configureEach is that all causes all elements to be realized and configured immediately, while configureEach just schedules an action to run when, and if, an element is actually realized and needs to be configured.

So if you, for example, use configureEach and in the build script use the lazy register instead of the eager create, then the action is not executed unless something else causes the element to be realized.
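As a minimal sketch of that laziness (the task name is made up, just for illustration):

```kotlin
// build.gradle.kts

tasks.configureEach {
    // scheduled, but only runs for tasks that actually get realized
    println("configuring $name")
}

// register is lazy: the configureEach action above does NOT run
// for this task unless something realizes it, e.g. running it or
// looking it up eagerly with tasks.getByName("hello")
tasks.register("hello") {
    doLast { println("Hello") }
}
```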

all and configureEach actions run immediately when an object is realized / created, before any further configuration happens.
So when doing

create("Spark 3.5.0 with Scala 2.13") {
    sparkVersion.set("3.5.0")
    scalaBinaryVersion.set("2.13")
}

the object is created, then the all and configureEach actions that are already registered are executed, and only then is the configuration closure you gave to the create call executed. That is why the properties are not yet set when you try to get them.
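To make the ordering concrete, assuming builds is the NamedDomainObjectContainer<Build> in question, this sketch shows what the configureEach action sees:

```kotlin
// assuming `builds` is the NamedDomainObjectContainer<Build> from the question
builds.configureEach {
    // runs right after the object is created,
    // BEFORE the closure handed to `create` below,
    // so the properties are not set yet at this point
    println("sparkVersion set? ${sparkVersion.isPresent}") // -> false
}

builds.create("Spark 3.5.0 with Scala 2.13") {
    // only now, after the configureEach action already ran,
    // are the values assigned
    sparkVersion.set("3.5.0")
    scalaBinaryVersion.set("2.13")
}
```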

Actually, it also does not really make sense to make these fields Property if you query them immediately anyway and do not wire them to other Property instances. For such direct querying it is just clutter imho. That said, it might be more future-proof, for example if you want to wire the values now or later, or if a user wants to wire them.

Using afterEvaluate is not really helpful; its main effect is to introduce timing problems, ordering problems, and race conditions. What, for example, if the user for whatever reason also uses afterEvaluate to configure these values? Then their configuration runs after your action, and things no longer work as expected.
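A sketch of that collision (someBuild is a hypothetical element; afterEvaluate actions run in registration order):

```kotlin
// in the plugin (registered first):
project.afterEvaluate {
    // reads the property here, but the user's value below
    // has not been set yet at this point
    println(someBuild.sparkVersion.orNull) // -> null
}

// later, in the user's build script (registered second, so it runs second):
afterEvaluate {
    // too late: the plugin's action above already ran
    someBuild.sparkVersion.set("3.5.0")
}
```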

One way to solve this would be to not register the NamedDomainObjectContainer as extension directly, but to register a class of your own as extension, and in that extension have a function addBuild that takes name, sparkVersion, and scalaBinaryVersion as parameters. In that function you can then do your source set name calculation and create your custom source sets, as you have the values directly available.

If you still need the named domain object container for other purposes, you could for example also create the elements in that addBuild function in your extension and provide the collection as a property of your extension.
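A sketch of that approach could look like this (the extension name and the source set naming scheme are made-up assumptions, just to show the shape):

```kotlin
import org.gradle.api.NamedDomainObjectContainer
import org.gradle.api.model.ObjectFactory
import javax.inject.Inject

abstract class SparkBuildsExtension @Inject constructor(objects: ObjectFactory) {
    // still available for other purposes, e.g. iteration in the build script
    val builds: NamedDomainObjectContainer<Build> =
        objects.domainObjectContainer(Build::class)

    fun addBuild(name: String, sparkVersion: String, scalaBinaryVersion: String) {
        builds.create(name) {
            this.sparkVersion.set(sparkVersion)
            this.scalaBinaryVersion.set(scalaBinaryVersion)
        }
        // the values are plain parameters here, so the source set name
        // calculation and source set creation can happen right away
        // (hypothetical naming scheme):
        val sourceSetName =
            "spark${sparkVersion.replace(".", "_")}Scala${scalaBinaryVersion.replace(".", "_")}"
        // ... create and wire the custom source sets for sourceSetName ...
    }
}
```

In the plugin you would then register it with something like `extensions.create("sparkBuilds", SparkBuildsExtension::class)`, and users call `sparkBuilds.addBuild("Spark 3.5.0 with Scala 2.13", "3.5.0", "2.13")`.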


Btw., here is some boilerplate reduction if you like.
Build can simply be

interface Build : Named {
    val sparkVersion: Property<String>
    val scalaBinaryVersion: Property<String>
}

The factory you set is just what happens by default, so this is sufficient:

val builds = objects.domainObjectContainer(Build::class)

If you give builds as a parameter to the createSourceSets function, you don’t need to get it from project and unsafely cast it.
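For example (the function body is a placeholder, only the signature matters here):

```kotlin
// the container is handed in directly, so there is no
// project.extensions lookup and no unchecked cast needed
fun createSourceSets(builds: NamedDomainObjectContainer<Build>) {
    builds.configureEach {
        // ... source set creation for this build ...
    }
}

createSourceSets(builds)
```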
