Publishing to the Plugin Portal via Artifactory

We currently use Bamboo and Artifactory for our build and release management process. A release build is triggered in Bamboo and is published to a staging repository in our Artifactory instance. Once the staged release has been tested it’s then promoted to our release repository and to Maven Central and Bintray. This process worked well for Gradle plugin publication with the old Bintray-based approach. I’m now trying to see how the new Plugin Publish plugin-based approach fits in.

Is it possible to (ab)use the Plugin Publish plugin to publish the existing pom, jar, sources, and javadoc artifacts, downloaded from Artifactory, to the Portal?


Hi Andy

We don’t have support for staging releases in the plugin portal at this point.

The publishing plugin should publish artifact jars that it finds in the expected locations under the build directory, so a possible way forward for your workflow would be to download them to the locations they would have been built to. E.g. the main jar might go to build/libs/my-project-1.2.3.jar, etc.
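As a minimal sketch of that idea, assuming the era's task-declaration syntax: a task that pulls the staged artifacts down into build/libs before publishPlugins runs. The repository URL, coordinates, file names, and the task name itself are all placeholders, not anything the publish plugin provides.

```groovy
// Hypothetical sketch: fetch the staged artifacts from Artifactory into the
// locations the publishing plugin expects to find them. All URLs and names
// below are placeholders for your own staging repository and coordinates.
def stagedBase = 'https://artifactory.example.com/staging/com/example/my-project/1.2.3'
def artifactNames = [
    'my-project-1.2.3.jar',
    'my-project-1.2.3-sources.jar',
    'my-project-1.2.3-javadoc.jar'
]

task fetchStagedArtifacts {
    doLast {
        def libsDir = file("$buildDir/libs")
        libsDir.mkdirs()
        artifactNames.each { name ->
            // stream each staged artifact into build/libs
            new URL("$stagedBase/$name").withInputStream { input ->
                new File(libsDir, name).withOutputStream { it << input }
            }
        }
    }
}
```

Running fetchStagedArtifacts before publishPlugins would then put the already-tested binaries where the publish plugin looks for them.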

How do you do promotion of your existing artifacts to external repositories currently?

Interesting. Thanks for the suggestion. Is there a safe way to try this, without running the risk of uploading the wrong artifacts to the portal? Also, is there anything we can do to reuse the existing pom? I want to avoid having different poms and artifacts with the same Maven coordinates in different repositories.

It’s all done via Bamboo and Artifactory. There’s some more information in the Artifactory documentation. Beyond what’s described there, we can also trigger promotion to Maven Central and Bintray directly from Bamboo. I believe that’s implemented as some custom functionality in our Bamboo instance. It’s on the Artifactory roadmap for official support too.

I am interested in this subject also. The open source project I work with uses Jenkins at CloudBees to build, and then we declare a build to be the final release for a version. Previously, we could upload the binary from the build to Bintray and it would then be available from the plugin portal.

With the new process, there is no way to publish binaries from the CI build, unless we play the game where we download the build directory to a local machine and run the publish task. But this is cumbersome, and not without the risk that a binary other than the one from the official build ends up being used.

The publish plugin needs some way to be configured to point to a specific plugin binary rather than assuming it is in some output folder of a build.

Not really at this point. Longer term we’ll implement some sort of staging workflow, or at the very least a sandbox plugin portal, but we’re not there yet.

Without actually running the command, you could print out the files associated with each archive to be uploaded, which are defined by the archives configuration:

afterEvaluate { project ->
    project.configurations.archives.allArtifacts.each { println it.file }
}
To ensure that you don’t upload a jar that gets generated on that run (rather than the one you put in place), you could stick in a TaskExecutionListener that fails the build if those tasks actually execute. Something like:

gradle.taskGraph.whenReady {
    if (gradle.taskGraph.hasTask(":publishPlugins")) {
        def jarTaskNames = ['jar', 'sourcesJar', 'javadocJar']
        gradle.taskGraph.addTaskExecutionListener(new TaskExecutionAdapter() {
            void afterExecute(Task task, TaskState taskState) {
                if (jarTaskNames.contains(task.name) && !taskState.skipped) {
                    throw new IllegalStateException("Executed ${task.name} task, should be up-to-date")
                }
            }
        })
    }
}
While I understand wanting that, the pom is considered an implementation detail for plugin resolution. We use poms for dependency information more for compatibility with Bintray which we were using for hosting than anything else. The Gradle plugin repo shouldn’t be used by any other consumers, and we don’t really expect poms in there to be used for the more general metadata that people put in them for publishing to general public repos. We may even change the implementation to generate poms with the correct dependency information on the fly at some point. So we won’t be adding the ability to use a custom pom or customise the one that gets generated - we’d rather that people don’t think about there being a pom in there at all.

So is this done by renaming / republishing an existing non-final build? I don’t see what about your build wouldn’t work here, but I presume that there’s a detail that I don’t know yet. The build where the jar is built and the uploading / publishing happen at different times / places, is that right?

This is potentially a useful feature for people publishing to their own repos. How would you see it working for your case? Just define the dependency like you would in a dependencies block? We’d probably have to download the jar(s) locally and re-upload, and pass through dependency information from the downloaded pom or ivy descriptor.
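For illustration only, such configuration might look something like the sketch below. The existingArtifact property is an invented name, not real publish-plugin API, and the URL is a placeholder; this is just one shape the feature could take.

```groovy
// Purely hypothetical -- no such option exists in the publish plugin today.
pluginBundle {
    // point the portal upload at a jar that was built and published elsewhere;
    // dependency information would be read from its pom or ivy descriptor
    existingArtifact = uri('https://repo.example.com/releases/com/example/my-plugin/1.0.0/my-plugin-1.0.0.jar')
}
```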

That’s good news. Thanks.

This is starting to feel rather brittle and, particularly without any staging mechanism, it makes the risk of uploading a plugin to the Portal start to outweigh the benefits.

I don’t think the pom can be considered an implementation detail, as it leaks out in some obvious places. It’s apparent from the m2 in the plugin repository’s URL that Maven, and therefore a pom, is involved. Users can also see log messages for a pom being downloaded when the plugin isn’t already cached locally.

I also have a more general concern about what should be the same artifact being in different repositories with different metadata. If a user reports a problem with the plugin that could be caused by a faulty pom (perhaps a dependency is in the wrong scope), I don’t want to have to ask how the plugin was installed to know which pom is involved.

Another point in favour of allowing plugin authors to provide their own pom is that it means that the automatic pom generation doesn’t have to understand things like jarjar. Perhaps it already does? However, providing an escape hatch that allows a user to specify their own pom would appear to be more robust as it gives the plugin author complete control when needed.

I think it’s a shame that the new approach replaced the old approach. While it is undoubtedly simpler, that simplification has come at a cost that’s too high for us. Could the old approach be run in parallel for plugin authors with more sophisticated requirements? It worked very well for us, providing all of the control and safeguards that are missing from the new plugin-based approach.

No. It is a normal build, any of which can be declared final after review of the build and agreement among the committers. After we declare a build final, we increment the version number in source control to begin the next release cycle. For example, we are currently building version 3.0. Each build makes 3.0-versioned jars (the jar version is also suffixed with the build timestamp). One of the builds, when we are ready, will be declared the final 3.0 build, and we will want to promote those built jars into the universe as the released 3.0 jars. We will then update the version in source control to 3.1 to start the next release cycle.

Well, if a build on the CI system is declared the final build, I need to then get the built jar into repositories for consumers. With Bintray and Maven, I can publish that jar built in the CI build which was declared the final build (after it was built). Previously the Gradle plugin portal used the jar from Bintray, so this worked fine. With the new plugin publishing plugin, I can’t publish a jar built elsewhere (like in my CI build) without some game playing or rebuilding (which defeats the point of declaring a build final after it is built).

Well, I don’t see it as part of a build. It is a discrete step after a build that promotes a build artifact to released status. We can work like this in Bintray and Maven (as we currently do). The Bintray and Maven models allow you to upload the artifact in a staging state, where it can be reviewed by others on the project, and then it is promoted to released.

The plugin publishing plugin’s model seems to encourage a working mode where either a developer builds and publishes from her workstation (which can result in binaries different from the CI-built binaries) or there is a special build on the CI system just to build and publish. This special build can have issues since it is infrequently used and can thus fall out of date with the normal CI build.

On the “staging” front, couldn’t you make use of Bintray’s publishing mechanism that holds binaries in an unpublished state for 24 hours? It’s just a publish flag when you push JARs to Bintray. Surely that would work as a short-term measure?
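As a sketch of that flag in the com.jfrog.bintray plugin’s DSL (the repo, package, and publication names below are placeholders for your own setup):

```groovy
// Sketch using the gradle-bintray-plugin: with publish = false the uploaded
// files stay in Bintray's unpublished holding state (discarded after ~24 hours
// unless published via the UI or REST API).
bintray {
    user = project.findProperty('bintrayUser')
    key = project.findProperty('bintrayKey')
    publications = ['mavenJava']   // placeholder publication name
    publish = false                // hold the upload for review before publishing
    pkg {
        repo = 'maven'             // placeholder repo name
        name = 'my-plugin'         // placeholder package name
    }
}
```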

So are you saying that every CI build would upload to bintray with publish=0 and then if that build is deemed the final build, we would hit the Publish button on bintray? That seems overkill for each CI build to do that and it means you only have 24 hours to make the determination that the build is final. There is also the case of later builds overwriting the uploaded content before you make the determination.

My suggestion was more in response to Andy’s concerns about testing the publication without doing a full publish. If the plugin-publishing plugin were to have an option of publish=0, then Andy could safely test the publication mechanism without risking the release of a dud/broken/incorrect version.

It’s an interesting suggestion. Thanks, Peter. Unfortunately, I don’t really want to have to abandon the tried and tested release management process that we use for all of our other projects. AFAIK, even with delayed publishing, using the plugin publishing plugin would still require me to do that so it remains a non-starter.

Right now, I’ve avoided the problems by no longer publishing the plugin to the portal. No one’s complained so it’s good enough for now.

Would it solve the problem if Artifactory itself supported publishing Gradle plugins to the Gradle plugin portal as part of promoting staged releases?

I created a feature request in the Artifactory JIRA.