Writing a Gradle plugin that transforms classes

I’m writing a Gradle plugin that, when applied on a JavaCompile or KotlinCompile task:

  1. Reads compiled classes (.class files) from sourceSet.output.classesDirs; source sets are selected by the user through a configuration key
  2. Performs some bytecode transformations
  3. Writes the transformed classes into a separate build/transformedClasses directory
  4. Has subsequent tasks, e.g. a Jar task, take build/transformedClasses instead of build/classes as input, seamlessly for the plugin consumer.

I’ve managed to implement steps 1 to 3, but I don’t know how, or whether, step 4 can be done.

The pseudo-code I have right now is:

tasks.withType(JavaCompile::class.java) { compileTask ->
    tasks.register("transform" + compileTask.name, MyTransform::class.java) {
        dependsOn(compileTask)

        inputSourceSets.set(sourceSets)
        sourceSetsToTransform.set(extension.selectedSourceSets)
    }
}

As an end result, what I’m trying to achieve is to have:

  • the original compiled classes in build/classes, kept as a reference
  • the transformed classes in build/transformedClasses, used for whatever comes next.

Is there a canonical way of doing this, or are there other projects that achieve something similar? I have had some success by changing the java.destinationDirectory and compiledBy values of the selected source sets, but that does not allow me to keep both the original and the transformed classes.

Let’s start with your snippet.
Doing tasks.withType(...) { ... } is highly discouraged, as it breaks task-configuration avoidance for all tasks of that type.
Instead, tasks.withType(...).configureEach { ... } should usually be used, which runs lazily and only for the tasks that are realized anyway.

On the other hand, within configureEach you cannot register additional tasks, but you shouldn’t register a task from the configuration of another task anyway.
Instead, if you want to register a task for the compile task of each source set, use sourceSets.all { ... } or similar. There might, for example, also be custom JavaCompile tasks a user of your plugin added that they do not want your logic to apply to.
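As a minimal sketch of that registration pattern (MyTransform and the inputSourceSetName property are assumed names, not anything from your plugin):

```kotlin
// Sketch: register one transform task per source set instead of per
// JavaCompile task, so custom compile tasks are left untouched.
sourceSets.all { sourceSet ->
    tasks.register(
        "transform" + sourceSet.name.replaceFirstChar(Char::uppercase) + "Classes",
        MyTransform::class.java
    ) {
        dependsOn(sourceSet.classesTaskName)
        inputSourceSetName.set(sourceSet.name)
    }
}
```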

To point 4: do you really need only that specific Jar task to include the transformed classes, or should any consumer of the compilation result use the transformed classes instead? Usually the latter is what you want. And then it is a pretty bad idea to do it with a separate task, as you will probably never catch all the tasks and things you need to reconfigure, and you may not even notice that you missed some.

From what you described, and assuming the latter answer to the last paragraph’s question, I would probably add a doLast { ... } action to the compile tasks in question. In that action I would copy the originally compiled classes to some place where you have them “as a reference”, and then do the bytecode manipulation in place.
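That doLast approach could be sketched roughly like this, assuming a hypothetical transformBytecode(File) helper and an arbitrary location for the reference copy:

```kotlin
// Sketch only: copy the original classes aside, then transform in place.
// transformBytecode(...) is a hypothetical helper, not a real API.
tasks.named(sourceSet.getCompileTaskName("java"), JavaCompile::class.java) {
    doLast {
        val classesDir = destinationDirectory.get().asFile
        val referenceDir = classesDir.resolveSibling("original-" + classesDir.name)
        classesDir.copyRecursively(referenceDir, overwrite = true)
        classesDir.walkTopDown()
            .filter { it.isFile && it.extension == "class" }
            .forEach { transformBytecode(it) }
    }
}
```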

Thank you for your prompt and insightful response!

Indeed, for 4, I do need “any consumer of the compilation result to use the transformed classes”. I used tasks because said transformations are controlled by several parameters provided by the user through an extension. So it seemed fitting and convenient to use @get:Input-like properties to make the work incremental, i.e. out of date on configuration changes.

If I understood your points correctly, I should have written something like:

project.tasks.withType(JavaCompile::class.java).configureEach {
    doLast {
        sourceSets.filter { ... }.forEach { source ->
            // walk source.output.classesDirs and do stuff to the bytecode
        }
    }
}

I’m unsure how I can make this depend on my extension properties. Should I somehow add inputs to each JavaCompile task?

So it seemed fitting and convenient to use @get:Input -like properties to make it incremental, i.e. out of date on configuration changes.

You can add all of those via the runtime API too, using inputs.file, inputs.files, inputs.properties, and so on; the same goes for outputs. And you should indeed always properly declare the inputs and outputs.
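With the runtime API that could look like the following sketch (the extension property names selectedSourceSets and transformRules are assumptions):

```kotlin
// Sketch: declare extension-driven configuration as task inputs via the
// runtime API, so the task goes out of date when the user changes them.
tasks.named(sourceSet.getCompileTaskName("java")) {
    inputs.property("selectedSourceSets", extension.selectedSourceSets)
    inputs.files(extension.transformRules)
        .withPropertyName("transformRules")
        .withPathSensitivity(PathSensitivity.RELATIVE)
}
```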

If I understood your points correctly, I should have written something like:

Not really. You should not do anything with sourceSets in the task action.
If you want users to be able to configure the source sets, add a function handleSourceSet to your extension that takes a source set as a parameter; in the function body, get the compile task(s) for that source set and add the doLast action exactly to those compile tasks.

As you showed it, every JavaCompile task would walk through all configured source sets and modify their output classes, which would be extremely bad: for one, it does not make any sense at all, and more importantly, all compile tasks would now have overlapping outputs without even declaring them.

does this implement something similar?

Only very slightly, and the way it does what it does is very bad practice.
That plugin does not modify the compilation result; it tries to transform the classpaths of all JavaExec tasks and all source sets.
But it is full of problematic code.
For example, it breaks task-configuration avoidance for all JavaExec tasks, it uses afterEvaluate to change some configuration, which can easily be bypassed accidentally or intentionally, and it does some other quite bad or strange things.

That plugin should most probably simply be implemented as an artifact transform.

That helped immensely. I’ve implemented something resembling what you suggested, and it works just as intended. Thank you so much!

For the source set selection, I resorted to computing a shouldTransform boolean that is true when the compile task being configured matches any of the involved source sets.

So, in short, it does:

project.tasks.withType(JavaCompile::class.java).configureEach {
    defineInputs(...) // inputs.file, inputs.property, etc
    val shouldTransform = sourceSets.any {
        it.getCompileTaskName(language) == name && extension.selectedSourceSets.get().contains(it.name)
    }
    if (shouldTransform) {
        doLast { ... }
    }
}

I tried writing a method in my extension that does handleSourceSet(set) -> JavaCompile but ran into a few issues; is my approach bad practice?

I have one more question I’m not sure I found the answer to: since my plugin is designed to be configured on JavaCompile, KotlinCompile, AppPlugin, and LibraryPlugin (the last two from AGP), how do my dependencies end up on a consumer’s classpath? Should I declare them as compileOnly dependencies and let the magic operate, or use implementation and carry everything with it?

i.e.:

dependencies {
    compileOnly("com.android.tools.build:gradle:8.5.0")
    compileOnly("org.jetbrains.kotlin:kotlin-gradle-plugin:2.0.20")
    compileOnly("org.jetbrains.kotlin:kotlin-gradle-plugin-api:2.0.20")
}

Thank you again for your time :pray:

I tried writing a method in my extension that does handleSourceSet(set) -> JavaCompile but got a few issues, is my approach bad practice?

Well, it does not really work properly or reliably. For example, if task-configuration avoidance is broken, you might iterate over the source sets before the extension has been configured properly.
You could theoretically do that filtering in the execution phase, but only as long as you do not plan to be configuration-cache compatible, in which case you could not access sourceSets anymore.

Also, using sourceSets.any has a similar problem if source sets are added after that code was executed.

So all in all, I’d say yes, your approach is bad practice and flaky at best.

Which problem did you have with the function-in-extension approach?
Should be something like

abstract class MyExtension : ExtensionAware {
    @get:Inject
    abstract val tasks: TaskContainer

    fun transform(sourceSet: SourceSet, language: String) {
        tasks.named(sourceSet.getCompileTaskName(language)) {
            doLast {
                ...
            }
        }
    }
}

how are my dependencies used in a consumer’s classpath? Should I try to have them as compileOnly dependencies and let the magic operate, or use implementation and carry everything with it?

That highly depends on the details and is thus not really answerable generically.
As with any JVM project, if you have them as compileOnly you depend on the classes being present on the target classpath; with implementation you bring them in yourself.

If your plugin does not require those plugins, but can react to them being applied (using pluginManager.withPlugin(...) { ... }), then you usually only want a compileOnly dependency, so that you don’t bring those plugins onto the classpath of your consumers needlessly, making things slower, and so on.
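A sketch of that “react to them being applied” pattern (the plugin IDs are real, but the configureForKotlin/configureForAndroid helpers are hypothetical):

```kotlin
// Sketch: with compileOnly dependencies, only reference the other plugins'
// types inside withPlugin blocks, so their absence on the consumer's
// classpath never causes a failure.
override fun apply(project: Project) {
    project.pluginManager.withPlugin("org.jetbrains.kotlin.jvm") {
        configureForKotlin(project) // hypothetical helper using KGP types
    }
    project.pluginManager.withPlugin("com.android.application") {
        configureForAndroid(project) // hypothetical helper using AGP types
    }
}
```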

If your plugin must have those plugins available and for example also applies them, then it is most often better to also depend on the plugin, but there might also be cases where this is not good. :man_shrugging:

If your plugin does not require those plugins, but can react to them being applied

Yes! That is my use case, so I’ll move to the pluginManager.withPlugin API.

Also using sourceSets.any has a similar problem, if source sets are added after that code was executed.

Well, that’s what I am now confused about. Where am I supposed to cycle through the source sets and filter based on user input so as not to miss any?
i.e. when my plugin is applied, this is wrong:

project.pluginManager.withPlugin("java") {
    val sets = project.extensions.getByName("sourceSets") as SourceSetContainer
    sets.forEach { set ->                // WRONG: would miss source sets added after this code ran
        extension.transform(set, "java") // inside: only doLast if set is in extension.selectedSets
    }
}

But this is bad as well:

project.pluginManager.withPlugin("java") {
    extension.selectedSets.get().forEach {   // WRONG: not evaluated yet, selectedSets.get() returns its default value
        extension.transform(it, "java")
    }
}

Don’t at all, use a function in the extension as I recommended :slight_smile:

Oh my god, I just got what you meant. I was so far from considering that as an option :sweat_smile:
I will explore that thank you!

However, this needs something extra to implement the convention “if transform is not called, call transform("main", "java") as a default”, right? I bet I could do something like

override fun apply(project: Project) {
    project.afterEvaluate {
        extension.ensureConfigured()
    }
}

afterEvaluate is a very bad idea. It is highly discouraged. The main gain you get from using it are timing problems, ordering problems, and race conditions. It is like using SwingUtilities.invokeLater or Platform.runLater to “fix” a GUI problem. It just does symptom treatment, delaying the problem to a later, harder to reproduce, harder to debug, and harder to fix point in time.

One option would be to just not have a default and if the user does not call the method, safely assume that he does not want to transform anything at the present time under the current circumstances.

If you really want to have a protection, you could for example add a runtime check, e.g. as an own task that has a boolean property which is also set by that method, and fails the build if the method was never called, hinting the user at the need to call it. :man_shrugging:
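A sketch of such a guard, assuming a simple flag on the extension (all names here are assumptions):

```kotlin
// Sketch: fail the build late if the extension method was never called.
abstract class MyExtension {
    internal var transformConfigured = false

    fun transform(sourceSet: SourceSet, language: String) {
        transformConfigured = true
        // ... wire up the compile task's doLast action as before
    }
}

// In the plugin's apply(): capture the flag lazily via a provider.
val configured = project.provider { extension.transformConfigured }
project.tasks.register("verifyTransformConfigured") {
    doLast {
        check(configured.get()) {
            "myExtension.transform(...) was never called; nothing will be transformed."
        }
    }
}
```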

I’m marking this as resolved because I now have everything I need to create something great. I’ve learned so much more in a week here than I did in the previous month. I’ve got to say, though, as a newcomer, it’s so easy to write something that seems to work but is very nonsensical :sweat_smile:

Thanks for helping me improve :wave:
