Hi, I’m a developer of a Gradle plugin which uses `TransformAction`s for dependencies. One of its users faced a problem using my plugin together with the Shadow plugin: on the first build, one of the dependencies in a subproject is empty; the second and subsequent builds succeed. I’m stuck investigating the problem; it seems to be a bug in Gradle. The minimal project reproducing the issue is GitHub - AramMessdaghi9001/decoroutinator-fail: This is an example project for the decoroutinator gradle plugin not working. I would be glad for advice on how to localize the problem.
Hard to say where it comes from exactly, as that plugin’s build is inherently broken and cannot even be synced to the IDE; besides that, it is loaded with bad practice over bad practice over bad practice.
But if you debug the failing test execution in the MCVE (which itself is also loaded with bad practice over bad practice) and have a look at the classloader URLs, you see that there is a `...transformed.../empty` and in turn the `lib.jar` is missing.
That most probably means that while transforming the `lib.jar`, the `lib.jar` was not yet present, which then went to the `else` path in the transform, which is a bug in itself. If you want nothing as a result, don’t call any `outputs` method. But if you get nothing as input, then this is a bug somewhere. By handling the case gracefully you just hide that bug and make it harder to discover.
Most probably it is something like Configuration.getElements() eagerly triggers artifact transforms · Issue #27372 · gradle/gradle · GitHub or similar, where either a Gradle bug, a plugin bug, or a build script bug causes the artifact transform to be invoked before the producing task was run; and once it was run, the result is fixed even if the artifact changes later in the build.
So even in the cases where the build does not fail, it still behaves wrongly and produces a wrong result, as it transforms the stale `lib.jar` of the last execution and later on creates a new `lib.jar` which is not used by consumers of the artifact transform; this could not happen if you did not gracefully handle the absence of the artifact but properly failed.
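The fail-fast approach could look roughly like this inside a `TransformAction` (a minimal sketch; `DecorateJarTransform` and the parameter-less setup are placeholders, not the plugin’s actual code):

```kotlin
import org.gradle.api.artifacts.transform.InputArtifact
import org.gradle.api.artifacts.transform.TransformAction
import org.gradle.api.artifacts.transform.TransformOutputs
import org.gradle.api.artifacts.transform.TransformParameters
import org.gradle.api.file.FileSystemLocation
import org.gradle.api.provider.Provider

// Hypothetical transform that fails fast instead of producing an empty output
abstract class DecorateJarTransform : TransformAction<TransformParameters.None> {

    @get:InputArtifact
    abstract val inputArtifact: Provider<FileSystemLocation>

    override fun transform(outputs: TransformOutputs) {
        val input = inputArtifact.get().asFile
        // If the input is missing, something upstream is broken (e.g. the
        // producing task did not run); fail loudly instead of hiding the bug.
        check(input.exists()) {
            "Input artifact $input does not exist; " +
                "the producing task probably did not run before the transform"
        }
        // ... the actual transformation, e.g. writing a modified jar:
        val result = outputs.file("${input.nameWithoutExtension}-decoroutinator.${input.extension}")
        input.copyTo(result, overwrite = true)
    }
}
```

This way a missing input surfaces immediately as a build failure at the transform, instead of silently producing an empty classpath entry that only breaks much later.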
Thank you for your feedback.
Could you please give some examples of bad practices in the plugin? So far I have realized that the absence of the artifact has to be handled differently (how?). What are other examples of bad practices? Maybe you can provide some links to the documentation?
> I realized that absence of the artifact has to be handled differently (how?).
In the transform?
Imho, throw an exception, as it only happens due to a bug or misuse, and handling it gracefully just makes it worse: you will always work with a stale artifact and sometimes even lose task dependencies due to such bugs.
Just mentioning here for completeness as I already explained it in more detail on Slack.
> Could you please give some examples of bad practices
You asked for it, so don’t complain.
I didn’t review your whole code, just some points that directly jumped out at me.
For the MCVE project:
- why do you declare buildscript repositories? You don’t use them and usually shouldn’t either
- never do cross-project configuration (`allprojects { ... }`, `subprojects { ... }`, `project(...) { ... }`, …); it immediately introduces project coupling, which works against some of the more sophisticated Gradle features and optimizations, makes builds less understandable, and most often harder to maintain
- don’t use the legacy `apply` method to apply plugins, but always the `plugins { ... }` block
- don’t break task-configuration avoidance unnecessarily like in `app/build.gradle:5`
- more a personal recommendation: use the Kotlin DSL instead of the Groovy DSL; the Kotlin DSL is by now the default DSL, you immediately get type-safe build scripts, actually helpful error messages if you mess up the syntax, and amazingly better IDE support if you use a good IDE like IntelliJ IDEA or Android Studio
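For the plugin-application point, the difference could look like this in a `build.gradle.kts` (a sketch; `com.example.some-plugin` and the version are placeholders):

```kotlin
// Legacy style, avoid:
// apply(plugin = "com.example.some-plugin")

// Preferred: the plugins block lets Gradle resolve the plugin early and,
// with the Kotlin DSL, generates type-safe accessors for its extensions
plugins {
    id("com.example.some-plugin") version "1.0.0"
}
```

The `plugins { ... }` block is also what makes the generated accessors mentioned further down in this thread available at all.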
For the plugin, even more cursorily:
- do not use some self-knit way to centralize version declarations, but use a version catalog, that is the built-in standard way and does not need 8 times the same parsing code, half of the time even using reflection for some reason
- if you insist on reinventing the wheel, at least fix its implementation; I couldn’t sync the project because it relies on the current working directory, which no build must ever do. A build that does is inherently broken, as the current working directory is not guaranteed to be the project directory or any specific directory. Sometimes it is the project directory, sometimes the daemon log directory, sometimes the IDE installation directory, … Code used in Gradle build logic must never ever rely on the current working directory, which includes the `File` constructor with a relative path and the `FileInputStream` constructor with a relative path. Actually, almost no JVM program should ever use the current working directory, unless it is a CLI program that resolves a relative path given by the caller on the command line (practically the only case where you can safely assume it is meant relative to the current working directory), or unless you can control the working directory for the process with certainty.
- what is the sense of `Project.stacktraceDecoroutinator`? If you just create the extension, you get such accessors for free already, and you even only get them if the plugin is actually applied and not only added to the classpath. If it is there to support adding the plugin to the classpath and configuring it in `subprojects` or `allprojects` blocks, that is highly discouraged bad practice anyway, so better not support it with questionable extension functions
- you are not Gradle, so better not provide code in the `org.gradle` namespace just to trick Gradle into adding your extension functions to the auto-imports; imho better let your users use an import if they want to use code from you, or use extensions for which type-safe accessors are generated automatically; the latter also has the advantage that it works the same independent of the DSL language the consumer is using, not only with Kotlin DSL consumers like currently
- remove all the `DependencyHandler.decoroutinator...` extension functions. Instead, add an extension named `decoroutinator` to the `DependencyHandler`s where you want it, and the extension will then have functions or properties for the 4 packages; this way it will cleanly work in both DSLs and without polluting the Gradle namespace
- do not use primitive types or similar for properties in extensions or tasks, but use the Gradle lazy types (`Property`, `RegularFileProperty`, `DirectoryProperty`, `SetProperty`, `MapProperty`, `ConfigurableFileCollection`, …); those have many advantages, especially that they can be wired together and only evaluated as late as possible, optimally only ever at execution time; they can also carry implicit task dependencies when used properly, and you can set conventions on them that are used in case the user did not configure an explicit value or decided to reset to the default value by setting `null`
- don’t use things like `hasPlugin`; such checks require a specific order of events and are thus prone to sometimes give the wrong result at configuration time, as the result could change after it was queried; if your plugin requires that another plugin is applied, apply it; if your plugin wants to react to a plugin being applied and do something in case it is, use `pluginManager.withPlugin(...) { ... }`
- Never use `afterEvaluate` if you can avoid it, at almost any cost. The main gain you get from using it is timing problems, ordering problems, and race conditions. There is almost always a better way to do something that is cleaner and more reliable. For example, for something you only need at task execution time, have proper lazy properties, wire them together, and only get the value at execution time. Or if you need some information at configuration time, better have for example a function in the extension that gets the value as argument and do the configuration change in that function body, eventually preventing the method from being called multiple times.
- Don’t use `configurations.all`, or you force all configurations to be realized while Gradle would otherwise treat that container lazily, like the task container with task-configuration avoidance; if you need a logic like you have, better do something properly lazy like `configurations.matching { matcher.matches(it.name) }.configureEach { ... }`
- Don’t use `tasks.withType(...) { ... }`, or again you break laziness, or better said task-configuration avoidance, for those tasks, wasting your time as they are always configured whether they are going to be executed or not. Instead do `tasks.withType(...).configureEach { ... }`. (Also here you could use `matching { ... }`, unless you need that debug log on non-match)
- As said above already, don’t use `allprojects { ... }` and friends.
- Don’t use `configurations.forEach`; besides that it also breaks the laziness like mentioned for `.all` above, it also only works with the configurations already registered at the time you call that method. If you want to configure all configurations, use `configurations.configureEach { ... }`; then you also get all configurations registered in the future and don’t need `afterEvaluate` or `project.state` or similar bad tricks
- Same for the `variants.forEach`
- Probably same for the `artifacts.any`
- Consider whether your artifact transform can be cacheable
- Not Gradle, but
  ```kotlin
  val suffix = root.name.lastIndexOf('.').let { index -> if (index == -1) "" else root.name.substring(index) }
  val newName = root.name.removeSuffix(suffix) + "-decoroutinator" + suffix
  ```
  ⇒
  ```kotlin
  val suffix = root.extension.let { if (it.isEmpty()) it else ".$it" }
  val newName = "${root.nameWithoutExtension}-decoroutinator$suffix"
  ```
- Don’t use `outputs.dir("empty")`; if you don’t want output, just don’t call any `outputs` method. And in that specific case, as said already, better throw an exception: if the artifact is not there, something is broken, and gracefully handling it just makes it worse
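Several of the points above (lazy properties instead of primitives, reacting to plugins with `pluginManager.withPlugin`, lazy container configuration) could be combined roughly like this in a plugin; this is a sketch with invented names (`DecoroutinatorExtension`, the `enabled` property, the `runtime` name filter), not the actual plugin code:

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property

// Hypothetical extension using Gradle's lazy types instead of primitives;
// Gradle generates the implementation of the abstract property
abstract class DecoroutinatorExtension {
    abstract val enabled: Property<Boolean>
}

class DecoroutinatorPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val extension = project.extensions
            .create("decoroutinator", DecoroutinatorExtension::class.java)
        // Convention: used if the user never sets a value explicitly
        extension.enabled.convention(true)

        // React to another plugin if and when it is applied, instead of
        // checking hasPlugin at some arbitrary point in configuration time
        project.pluginManager.withPlugin("org.jetbrains.kotlin.jvm") {
            // Configure matching configurations lazily; configureEach also
            // covers configurations registered later in the build
            project.configurations
                .matching { it.name.startsWith("runtime") }
                .configureEach {
                    // ... wire lazy properties here; read values only at execution time
                }
        }
    }
}
```

Because the extension only holds lazy `Property` values with conventions, nothing here needs `afterEvaluate`: consumers read the final values at execution time.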
After some investigation I have found the root cause of the absence of the transformed artifact: it’s a Shadow plugin bug.