Plugin Portal & Improving Plugin Development

summary: Making it easier to retrieve, publish, host, test and document plugins and providing an online service for hosting Gradle open source plugins.
status: Discussions are happening and work will start not long after the release of 1.0.
code: planned

The plugin portal will enable Gradle users to more conveniently reuse plugins and scripts developed by other Gradle users. The portal will act as a database, with an interface to search for and find out about the plugins that are out there. Other planned features:

  • Hosts multiple versions of the same plugin.
  • Runs nightly automated tests to validate which versions of a plugin work with which versions of Gradle (including the latest Gradle from trunk).
  • Provides a staging workflow for publishing new plugins (do they provide documentation, integration tests, …).
  • Gradle would also provide a Plugin Development plugin that makes it very easy to write and publish plugins.
  • The source and the actual binaries of the plugins can live wherever the author wants them to live.

This will include a set of Gradle plugins that make it easier to retrieve, publish, host, test and document plugins. These won’t depend on a particular host: the host might be the official Gradle open source plugin portal, but it could also be any internal enterprise repository manager.

This is at the top of my personal wish list for Gradle. These are my primary pain points: “…retrieve, publish, host …”.

I’m keen to work on these issues and contribute improvements. Finding the best way to fit into the roadmap and agreeing on the correct direction for Gradle have been the main blockers to date.

GRADLE-1677 was a step towards the retrieve portion, as was GRADLE-1653. These two mailing list discussion threads relate to GRADLE-1677:

I’d like to revive the discussion around the retrieval part of this roadmap item. Is this forum topic a good place for it?

yes it is

@adam Great!

Here’s my recap of this thread

In terms of the user experience, we settled on this:

"Being able to inline the dependency declaration into the apply statement is something we've discussed before. The plan is to have the apply statement handle dependency coordinates. You would be able to do any of the following:

  1. apply group: 'mygroup', name: 'myplugin-project', version: '1.0', plugin: 'myplugin'
  2. apply group: 'mygroup', plugin: 'myplugin', version: '1.0'
  3. apply plugin: 'myplugin'"

We then came up with a technical solution:

"I think there are really 2 options to solving this:

  1. Limit the places where this type of apply statement can appear in the script, eg it must be a top-level statement.
  2. Don't even try to support changing the build script classpath.

I'd go with option 2. The apply method would work something like this:

  1. Create a new configuration containing the plugin dependency, and resolve the configuration.
  2. Create a URLClassLoader from the resulting files. The parent ClassLoader should be GradleInternal.getScriptClassLoader().
  3. Use the ClassLoader to look up the plugin.

Nice and simple. It might be good to have a cache of classloaders created in step 2, so we can share the implementation classes across all projects in a build (and, when using the daemon, across builds).

What is interesting about this approach is that plugins are isolated from each other. The big downside is that the plugin cannot contribute classes to the build script. But I think it might be a good thing to move away from needing to do this."
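The three steps described above might be sketched roughly like this in plain Java, with the configuration resolution elided; the class and method names here are my own illustrations, not Gradle API:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.List;

class IsolatedPluginLoader {
    /**
     * Builds an isolated class loader for an already-resolved plugin
     * classpath. The parent is the script class loader, so plugin classes
     * can see Gradle's API, but the plugin's own classes stay invisible
     * to other plugins and to the build script.
     */
    static ClassLoader classLoaderFor(List<File> pluginClasspath, ClassLoader scriptClassLoader) throws Exception {
        URL[] urls = new URL[pluginClasspath.size()];
        for (int i = 0; i < pluginClasspath.size(); i++) {
            urls[i] = pluginClasspath.get(i).toURI().toURL();
        }
        // Isolation comes from giving each plugin its own child loader.
        return new URLClassLoader(urls, scriptClassLoader);
    }
}
```

Looking the plugin class up via the returned loader would then be step 3.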

While discussing issues related to the technical solution, we stalled here:

"The more I consider this, the more it just feels wrong (by “this” I mean not exposing types from plugins to the build file compile). I can see it being a source of much confusion and annoyance to users. Not having types as first class citizens has an unclean feel to it and seems like a compromise which has the potential to turn people off. I think we need to keep searching for a cleaner solution."

What would help us proceed?

Something that caught my eye in the other thread was the wish to also be able to do something akin to this:

apply plugin: 'company-plugin', fromJar: 'url://to/company-plugin.jar'


Although for my taste anything we do needs to be compatible with working without a network connection.

I had suggested that, but I think I can live with using apply from.

This seems to work for me:

Remote Script:

buildscript {
   repositories {
      mavenRepo urls: ['']
   }
   dependencies {
      classpath ''
   }
}
apply plugin: 'company-plugin'

Local Script (Single Project):

apply from: 'file:///Users/elberry/development/projects/mine/test/remote.gradle'

Local Script (Multiple Project):

subprojects {
   apply from: 'file:///Users/elberry/development/projects/mine/test/remote.gradle'
}

This seems to solve my issues, and seems to work fine on 1.0M3. I think there were some drawbacks to this though, and I haven’t tested it with 1.0M5 yet.

This no longer seems to work for me in milestone-5.

There is an existing bug for this however: GRADLE-1517

Wouldn’t it be best if there were a central repository hosted by Gradle where all the plugins could live? This would make using them much easier in the long run, since we would not have to track the location of each plugin, but could simply rely on the central repository.

This is exactly the idea of the plugin portal. It’s a central repository that you point your build at to find plugins.

This repository may or may not actually host the plugins. It may instead simply point Gradle at the real location of the plugin. Either way, the build author does not need to know where the plugin happens to live, Gradle + the plugin portal will take care of this.

At this stage, I think the preference is that the portal will physically host the plugins.

We use Artifactory to proxy external repositories, to protect us against outages and to increase speed for the team. If the plugins were not in a central Maven-compatible repository but spread out over the net, we could not proxy them with Artifactory.

My intention for this is that if we do allow you to host your binaries anywhere, we will proxy it in some fashion so there is always a consistent URL scheme you can hit.

My preference at this stage is this approach, but at least initially it may be a requirement that we host the binaries.

I’m looking to work on this again next week (GRADLE-1677) at the local Hackergarten event. Are there any high-level decisions we need to make beforehand?

Quick recap:

"Being able to inline the dependency declaration into the apply statement is something we’ve discussed before. The plan is to have the apply statement handle dependency coordinates. You would be able to do any of the following:

  1. apply group: 'mygroup', name: 'myplugin-project', version: '1.0', plugin: 'myplugin'
  2. apply group: 'mygroup', plugin: 'myplugin', version: '1.0'
  3. apply plugin: 'myplugin'"

It would be nice to have a short notation, e.g. apply plugin: "mygroup:myplugin-project:1.0:myplugin" or apply plugin: "myplugin", from: "mygroup:myplugin-project:1.0"

and perhaps a shorthand for specifying repository along with it. Eg.

apply plugin: "mygroup:myplugin-project:1.0", mvn: "http://host/repo"


apply plugin: "mygroup:myplugin-project:1.0", ivy: "http://host/repo/[artifact]-[pattern]"
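As a sketch of how such a shorthand string could be taken apart (the class name and the rule that the plugin id defaults to the module name are purely my own assumptions, not an agreed design):

```java
// Hypothetical parser for the proposed "group:module:version[:plugin]"
// shorthand. Not part of any real Gradle API.
public class PluginNotation {
    public final String group, module, version, plugin;

    public PluginNotation(String group, String module, String version, String plugin) {
        this.group = group;
        this.module = module;
        this.version = version;
        this.plugin = plugin;
    }

    /**
     * Splits "group:module:version[:plugin]". When the fourth segment is
     * omitted, the plugin id is assumed to default to the module name.
     */
    public static PluginNotation parse(String notation) {
        String[] parts = notation.split(":");
        if (parts.length < 3 || parts.length > 4) {
            throw new IllegalArgumentException(
                "Expected group:module:version[:plugin], got: " + notation);
        }
        String plugin = parts.length == 4 ? parts[3] : parts[1];
        return new PluginNotation(parts[0], parts[1], parts[2], plugin);
    }
}
```

For example, "mygroup:myplugin-project:1.0:myplugin" would yield the four coordinates, while "mygroup:myplugin-project:1.0" would fall back to "myplugin-project" as the plugin id.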

@Merlyn: the build classpath needs to be constructed before any calls to apply(), so it’s not clear to me how we’d use the dependency coordinates so late in the lifecycle.

Apart from that problem, I like the notation.

Yes. That’s tricky. Either the apply function needs to be part of the first pass of the script (tricky/dangerous), or the dependency of the plugin is not shared (with the build script itself and the other plugins).

Last time we discussed this the sticking point was around making classes like Tasks available to the build script.

Treating these things like script plugins in that they’ll have their own isolated classloader hierarchy sounds like a good idea to start with. That should work.

I can see 4 broad strategies here:

  1. We extract all the calls to 'apply' in a pass that runs before we assemble the final script classpath, and use these calls to decide what to add to the script classpath. This may or may not be the same pass that evaluates the 'buildscript { }' blocks.
  2. We use some mechanism for adding stuff to the script classpath that is separate from applying the plugin (this is the strategy we use now).
  3. We get rid of the need for plugins to be visible on the script classpath.
  4. Transform the build script so that unresolved static types are replaced with dynamic types, and have the 'apply' statements contribute stuff to the script classpath dynamically as they execute.

Option 1) is just too complicated to get right, except perhaps for top-level 'apply' statements early in the script (which they often are). So we could extract only a certain subset of 'apply' statements; for example, perhaps these are ok:

apply plugin: "group:module:version:plugin"

subprojects {
    apply plugin: "group:module:version:other-plugin"
}
Option 2) doesn't necessarily have to be as complex as it is now. For example, perhaps 'apply' statements in the 'buildscript { }' block contribute to the script classpath, and those outside the 'buildscript { }' block do not:

buildscript {
    apply from: "group:module:version:plugin"
}

subprojects {
    apply from: "group:module:version:other-plugin"
}

However, to me, 3) is the way to go, perhaps combined with 2) for those cases where it really does make sense.

So, given this, I would have ‘apply’ create an isolated classloader for the plugin, with some caching so that we reuse the classloader across build scripts.
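The caching mentioned here could be as simple as a concurrent map keyed by the plugin coordinates, so that every build script (and, under the daemon, every build) applying the same plugin version shares one loader. A hypothetical sketch; the class and method names are mine, not Gradle internals:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Function;

class PluginClassLoaderCache {
    private final ConcurrentMap<String, ClassLoader> cache = new ConcurrentHashMap<>();

    /**
     * Returns the cached loader for the given coordinates, creating it
     * with the supplied factory on first use. Reusing the loader means
     * the plugin's implementation classes are shared across all projects
     * in a build, and across builds in a long-lived daemon.
     */
    ClassLoader forCoordinates(String coordinates, Function<String, ClassLoader> factory) {
        return cache.computeIfAbsent(coordinates, factory);
    }
}
```

A second 'apply' of the same coordinates would then get the identical loader back, and the factory (which would resolve the configuration and build the isolated loader) would run only once.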

Thanks, Adam.

So if we go with 3), how do we make plugin types available to the build script? E.g. the Xslt task:

Would that be a scenario in which we would use 2) ?