How to add custom properties to dependency configurations?

We are working on providing a custom dependency configuration for an in-house language (OpenEdge ABL). The requirement is that we need a property that can be set on a dependency and then used later, when the artifact defined by that dependency is consumed.

Example:

plugins {
  id "progress.openedge.abl"
}

dependencies {
    implementationAbl('<group>:<name>:<version>:<classifier>@pl') {
        packageType = "someValue1"
    }

    implementationAbl('<group>:<name>:<version>:<classifier>@pl') {
        packageType = "someValue2"
    }
}

The plugin will create implementationAbl as a configuration and then customers can use it to define their dependencies (an archive with the .pl extension).
But we are not able to find a way to define the property packageType so that it can be set in the build script and then read when the artifact is consumed. The problem is that this property should be a member of the configuration, but we don't really have much control over the responsible classes. I didn't even find a way to extend or implement those classes.

I saw some related questions on the Gradle forum but mostly they are not answered. Can someone please help me here? Is this something that can be achieved? Is there any better way to achieve this? (It will be okay to add this property inside implementationAbl { artifact{} } as well)

Thanks,
Rahul

Are your artifacts coming from Maven?

Can you explain how this packageType value will be used?

Also, see this:
https://docs.gradle.org/current/javadoc/org/gradle/api/artifacts/dsl/DependencyHandler.html

This shows a block artifact { } within the dependency configuration block which might have the attributes you are looking for.
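For illustration, a rough sketch of that block (the coordinates are placeholders, as in the original question):

dependencies {
    implementationAbl('<group>:<name>:<version>') {
        artifact {
            // DependencyArtifact properties; which of these you need depends on your artifacts
            name = '<name>'
            classifier = '<classifier>'
            extension = 'pl'
            type = 'pl'
        }
    }
}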

Thanks @EarthCitizen for the reply.

Yes, the artifacts can come from any repository (Maven, etc.) or could point to a local file. Consider it similar to the implementation configuration provided by the Java plugin.

The value of the packageType attribute could be something like source or binary, and the behavior will be:

  1. if the value is binary, the artifact will be directly added to the Propath (similar to Java’s Classpath) of the ABL runtime
  2. otherwise if the value is source, we need to extract the artifact (.pl archive) and then add it to the Propath of the ABL runtime.

This property is needed mainly to support the 2nd option.

I am aware of the artifact {} block but I don’t think we can use any of those properties because they have different meanings and we expect users to use them for defining their artifacts. If we can somehow add an additional property either to implementationAbl { } or implementationAbl { artifact{} } block, then our problem will be solved.

This sounds more like you want an artifact transform. The artifact transform will then do the unpacking as needed.


Thanks @Vampire, artifact transform is exactly what I was looking for.

We decided to create two different configurations, and apply the transform to the one which we want to be extracted.

There is another issue that we are facing now while resolving the configurations. So consider configurations implementationAblSource and compilePropath, where compilePropath extends implementationAblSource.

Now, implementationAblSource is transformed (extracted), and resolving this configuration (i.e. using implementationAblSource.files) returns the transformed folders, but compilePropath.files still returns the original files and not the transformed folders. Is this expected behavior or am I missing something?

Sample script used:


import org.gradle.api.artifacts.transform.TransformParameters
abstract class Unzip implements TransformAction<TransformParameters.None> {
    @InputArtifact
    abstract Provider<FileSystemLocation> getInputArtifact()

    @Override
    void transform(TransformOutputs outputs) {
        def input = inputArtifact.get().asFile
        println "input $input"
        def unzipDir = outputs.dir(input.name)
        println "unzipDir $unzipDir"
        unzipTo(input, unzipDir)
    }

    private static void unzipTo(File zipFile, File unzipDir) {
        // implementation...
    }
}

def processed = Attribute.of('processed', Boolean)
def artifactType = Attribute.of('artifactType', String)

configurations {
    implementationAblSource {
        canBeResolved = true
        attributes.attribute(processed, true)
    }
    compilePropath {
        canBeResolved = true
    }
    compilePropath.extendsFrom(implementationAblSource)
}

dependencies {
    attributesSchema {
        attribute(processed)
    }
    artifactTypes.create("pl")
    artifactTypes.getByName("pl") {
        attributes.attribute(processed, false)
    }
    registerTransform(Unzip) {
        from.attribute(processed, false).attribute(artifactType, "pl")
        to.attribute(processed, true).attribute(artifactType, "pl")
    }
}

dependencies {
    implementationAblSource ('<group>:<module1>:<version>:<classifier>@pl')
    implementationAblSource ('<group>:<module2>:<version>:<classifier>@pl')
}

println configurations.implementationAblSource.files //prints the transformed folders
println configurations.compilePropath.files //prints the module files

Afair yes, extendsFrom just inherits the dependencies declared on the extended configuration, but not the attributes. So in your compilePropath you don’t request processed to be true and so your transform is not triggered.
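A minimal sketch of that, assuming the same processed attribute as in the script above, would be to also request the attribute on the resolving configuration:

configurations {
    compilePropath {
        canBeResolved = true
        // request the transformed variant here as well, since the attribute
        // is not inherited via extendsFrom
        attributes.attribute(processed, true)
    }
}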

Hmm, then there is no other option than creating a parent configuration with the correct attributes set. Thanks.

One more question: is there a good way to pass a project object to the transform class, say when it is declared in a separate class file? Passing the project object as a parameter is not possible due to a serialization issue.

Could not serialize value of type DefaultProject

I could get around this by using a static reference to the project object, but I am just wondering if there is a better way.

(My apologies for so many follow-up questions, but transforms are not that trivial and I couldn't find many examples on this topic.)

No, you did not get around it, you just made sure that you will run into serious problems in the future. Don't do that. What do you need the project for? Also, doing anything with static state is bad. Even using some library or utility that uses static state is bad as long as you don't properly isolate it. The static state leaks into later builds on the same daemon, and even across projects.

There are two reasons for which the project is needed:

  • To get one of our extensions that is added to the project, which in turn is needed to ensure some Ant dependencies (taskdefs) are added
  • The second reason is to use some of those taskdefs, so basically an AntBuilder is needed for that. These taskdefs are used to extract the archive (.pl) files that I talked about. PL files are our custom archive format and I have to use the Ant taskdefs to extract them.

I did consider some of the challenges that you pointed out, but thought it should be fine as I am only using the extension and the AntBuilder from the project. Though the more I think about the issues you pointed out, the less it seems like a good idea.

So what can be an alternative?

For point one, just pass only the extension, or if possible just the information from the extension that you actually need, ideally wired through properties.

For point two there is no good way yet, I think, as the AntBuilder is not injectable. But you can probably use Groovy's own AntBuilder. See here for more information: Provide a public equivalent for IsolatedAntBuilder · Issue #1157 · gradle/gradle · GitHub
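For point one, a hypothetical sketch of what that wiring could look like, assuming an extension registered under the name abl that exposes the taskdef classpath location as a plain String, and a transform parameter named antDepLocation (as shown later in this thread); none of these names come from the actual plugin:

def ablExt = project.extensions.getByName('abl') // hypothetical extension name

dependencies {
    registerTransform(Unzip) {
        from.attribute(processed, false).attribute(artifactType, "pl")
        to.attribute(processed, true).attribute(artifactType, "pl")
        parameters {
            // pass only the value the transform actually needs,
            // not the project or the whole extension
            antDepLocation = ablExt.antDepLocation
        }
    }
}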

Thanks @Vampire. I will give it a try.

Though this seems like a limitation in transforms. I mean, the project object should have been available (as the transform is registered on a project), or there should at least have been a way to pass some objects as constructor arguments while registering the transform.

There are various places where the project is not available due to isolation and other reasons.
For example in Worker API actions.
Or even at runtime in task actions when using the upcoming configuration cache (the documentation of which also gives advice on what to use instead for various things from the project instance).

Thanks @Vampire.

I passed the needed Ant dependency as a String parameter, and when transforming I am creating a new AntBuilder every time.

@Override
void transform(TransformOutputs outputs) {
    def input = inputArtifact.get().asFile
    println "input $input"
    def unzipDir = outputs.dir(input.name)
    println "unzipDir $unzipDir"

    AntBuilder ant = new AntBuilder()
    ant.setSaveStreams(false) // based on your suggestion
    ant.taskdef(resource: '<resource-properties-file>', classpath: parameters.<antDepLocation>, loaderRef: '<ref>')

    unzipTo(input, unzipDir, ant)
}

private static void unzipTo(File zipFile, File unzipDir, AntBuilder ant) {
    // implementation...
}

Do let me know if you have any other suggestions on how to create the AntBuilder. I didn't see much point in creating a ThreadLocal, as I believe the transform is already happening in different threads (based on the logs that I saw), so there will be multiple instances of AntBuilder anyway, and also since it is local and used only once.
The only thing that would have been great is if it were possible to have one instance of AntBuilder for the whole build cycle, but I guess that's not quite possible as of now.

Now that I finally have an end-to-end setup working, I have one more question: is it possible to have dependency declarations configure various parameters of the transform differently? For example:

abstract class Unzip implements TransformAction<Parameters> {
    interface Parameters extends TransformParameters {
        @Input
        String getAntDepLocation()
        void setAntDepLocation(String loc)

        @Input @Optional
        String getEncoding()
        void setEncoding(String encoding)
    }

    @InputArtifact
    abstract Provider<FileSystemLocation> getInputArtifact()

    @Override
    void transform(TransformOutputs outputs) {
        //impl
    }
}

dependencies {
    implementationAblSource ('<group>:<module1>:<version>:<classifier>@pl') //module1
    implementationAblSource ('<group>:<module2>:<version>:<classifier>@pl') //module2
}

while transforming module1, set encoding='UTF-8',
but while transforming module2, set encoding='ISO-8859-1'?
(This encoding will be used while extracting the PL file with Ant's copy task. I asked this as an example; ideally I will take a Map as input and pass it on to Ant.)

You can see above how I'm registering the transform. Can I register the transform, or at least set some parameters, differently for different dependency declarations (without creating more configurations)?

I passed the needed Ant dependency as a String parameter, and when transforming I am creating a new AntBuilder every time.

Maybe you should consider instead passing it as a file collection that you annotate with @Classpath.
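A minimal sketch of that, assuming the taskdef classpath is available as a file collection (the property name is illustrative):

interface Parameters extends TransformParameters {
    // the Ant taskdef classpath; @Classpath makes the input sensitive to the
    // classpath contents rather than to absolute file paths
    @Classpath
    ConfigurableFileCollection getAntClasspath()
}

At registration time you would populate it with something like parameters { antClasspath.from(...) }, and inside transform() you could pass parameters.antClasspath.asPath to ant.taskdef.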

I didn't see much point in creating a ThreadLocal, as I believe the transform is already happening in different threads (based on the logs that I saw), so there will be multiple instances of AntBuilder anyway, and also since it is local and used only once.

I don't think they are each in their own thread. They are probably running in a pool of worker threads, just like the Worker API items.
If you use a local variable to create the Ant builder, that's fine of course.
But for me the problem was that I used it to jar-sign 50 JARs and the RAM was busted by creating a new Ant builder for each task, so I created it in the ThreadLocal.
I also don't remember why I needed the save streams option; it might as well be just because I reused the Ant builder, but I really don't remember.

The only thing that would have been great is if it were possible to have one instance of AntBuilder for the whole build cycle, but I guess that's not quite possible as of now.

Afair it is not thread-safe, so that will not work. And even if it were, it might make things run sequentially that could be run in parallel. But well, that’s the reason I had the thread local, to have as many ant builders as necessary to run in parallel, but as few as possible.

ideally I will take a Map as input and pass it on to Ant

Well, if that is your ideal, do it like that.
Make a map and give it as a parameter to the transform.

Probably another option would be to add another attribute like transformedWithEncoding, then register the artifact transform multiple times, once for each encoding, and request that attribute for the concrete dependency instead of for the whole configuration, something like that.
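A rough sketch of the map option, assuming a MapProperty parameter keyed by module name (all keys, values, and the lookup strategy are illustrative):

interface Parameters extends TransformParameters {
    @Input
    MapProperty<String, String> getEncodingsByModule()
}

dependencies {
    registerTransform(Unzip) {
        from.attribute(processed, false).attribute(artifactType, "pl")
        to.attribute(processed, true).attribute(artifactType, "pl")
        parameters {
            encodingsByModule.put('module1', 'UTF-8')
            encodingsByModule.put('module2', 'ISO-8859-1')
        }
    }
}

Inside transform() you would then look up the encoding for the current artifact, for example by matching inputArtifact.get().asFile.name against the map keys, and fall back to a default encoding when there is no match.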


to have as many ant builders as necessary to run in parallel, but as few as possible.

Hmm, then it makes sense to use a ThreadLocal.

request that attribute for the concrete dependency instead of for the whole configuration, something like that

The last question: I'm not quite sure how to do this. How do I register the transform for some particular dependencies? Can you give me some examples or point me to documentation? I didn't find much on this.

In a nutshell, the question is: how do I register the transform with transformedWithEncoding=UTF-8 for dependency 1 but transformedWithEncoding=ISO-8859-1 for dependency 2?

dependencies {
    implementationAblSource ('<group>:<module1>:<version>:<classifier>@pl') //dependency 1
    implementationAblSource ('<group>:<module2>:<version>:<classifier>@pl') //dependency 2
}

Thanks again. The above suggestions have helped me a lot.

The last question: I'm not quite sure how to do this.

Which part of my description did you not understand?

How do I register the transform for some particular dependencies?

You don't, re-read my explanation please.

Can you give me some examples or point me to documentation? I didn't find much on this.

I don't have any, but it should actually be pretty straightforward.

In a nutshell, the question is: how do I register the transform with transformedWithEncoding=UTF-8 for dependency 1 but transformedWithEncoding=ISO-8859-1 for dependency 2?

You can try it like I suggested in the last paragraph of my last post.
I never tried that before, but I think it should work.

What do you mean by this? If you mean requesting the attribute on the dependency declaration like this:

dependencies {
    implementationAblSource ('<group>:<module1>:<version>:<classifier>@pl'){ //dependency 1
        attributes{
            attribute(transformedWithEncoding, "UTF-8")
        }
    }
}

then unfortunately this doesn’t work and I get this error:
> Cannot add attributes or capabilities on a dependency that specifies artifacts or configuration information

Oh, sad :frowning:
Yes, that was what I meant.
Seems this doesn't work together with @pl, which is the "specifies artifacts" part mentioned in the error.

Then you maybe indeed need one configuration per encoding,
or you need to supply the mapping via a map (or similar) to the transform.
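To close the loop, here is an untested sketch of the one-configuration-per-encoding route, building on the processed / artifactType attributes and the Parameters interface shown earlier (the configuration names, the transformedWithEncoding attribute and its 'none' placeholder are all illustrative, and these registrations would replace the single one shown earlier):

def transformedWithEncoding = Attribute.of('transformedWithEncoding', String)

dependencies {
    attributesSchema {
        attribute(transformedWithEncoding)
    }
    artifactTypes.getByName("pl") {
        // the raw .pl artifact has not been extracted with any encoding yet
        attributes.attribute(transformedWithEncoding, 'none')
    }
    ['UTF-8', 'ISO-8859-1'].each { enc ->
        registerTransform(Unzip) {
            from.attribute(processed, false).attribute(artifactType, "pl").attribute(transformedWithEncoding, 'none')
            to.attribute(processed, true).attribute(artifactType, "pl").attribute(transformedWithEncoding, enc)
            parameters {
                encoding = enc
                // antDepLocation would also need to be set here, as in the earlier registration
            }
        }
    }
}

configurations {
    implementationAblSourceUtf8 {
        canBeResolved = true
        attributes.attribute(processed, true)
        attributes.attribute(transformedWithEncoding, 'UTF-8')
    }
    implementationAblSourceLatin1 {
        canBeResolved = true
        attributes.attribute(processed, true)
        attributes.attribute(transformedWithEncoding, 'ISO-8859-1')
    }
}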