Gradle unable to properly authenticate against Artifactory

TL;DR: I’m getting sporadic 401/403 responses from Artifactory when Gradle issues GET/HEAD requests against my repositories to resolve dependencies. These requests show up as “non_authenticated_user” in my Artifactory logs, which makes no sense because other requests from the same build come in as the proper user and resolve fine. I have no idea what’s going on; why does it look like Gradle is sending some requests to Artifactory with proper credentials and others with missing or invalid credentials?

The longer version
We’re in the process of moving our CI from local Jenkins to GitLab. We have a few Gradle builds (regular Java/Scala projects, nothing special really) that need to fetch artifacts both from the “global” repos (jcenter and mavenCentral) and from our internal Artifactory repository.

Some of these projects are working fine but we’re having problems with some of the others. The issue is that the build fails with a 403 from Artifactory when trying to fetch some of the artifacts from the internal repos.

As I said, we’re seeing this problem with at least two of our projects. For the first one we looked at, we fiddled around a bit with the order of the repository definitions in the Gradle file and upgraded Gradle to 4.10.3, and then everything started to work fine and has been stable ever since.

I should note that we haven’t been experiencing this problem on local development machines, except for one developer who says he’s seen it; building the project again usually fixes it for him. We saw this happen every now and then on our old Jenkins CI as well, but there too, retrying the build usually worked. In GitLab we see this problem more frequently, and for one of our projects it’s been pretty consistent.

Long story short, that difference is simply due to the fact that the 403 responses are sporadic and unpredictable. Building a few times builds up the Gradle cache on dev machines and on Jenkins, since Gradle re-uses the same directory for GRADLE_HOME. On GitLab things are a bit more isolated, so every new job gets a fresh GRADLE_HOME. We’ll try to fix that of course, but that would only be slapping a band-aid on the problem. This realization also made it possible for us to reproduce the problem on our local machines: simply delete the Gradle cache before the build and you will (often… but not always) run into a 403 from Artifactory.

On to the second project. We’re already on Gradle 5.6.2 there, so there’s no newer version to upgrade to. This project keeps giving us 403 errors in GitLab CI. The problem is worse if we configure GitLab to check out a rather large submodule containing test data, which makes no sense since Gradle hasn’t even started when the submodule checkout is performed. But if we check out the submodule we almost always get the 403 from Artifactory; if we don’t check out the submodule, the build goes through.

So, here is the definition of our repositories in our Gradle build file for this second project:

repositories {
    jcenter()
    mavenCentral()
    maven {
        url "${artifactory_contextUrl}/libs-release"
        credentials {
            username "$artifactory_user"
            password "$artifactory_password"
        }
    }
    maven {
        url "${artifactory_contextUrl}/libs-staging"
        credentials {
            username "$artifactory_user"
            password "$artifactory_password"
        }
    }
    maven {
        url "${artifactory_contextUrl}/libs-snapshot"
        credentials {
            username "$artifactory_user"
            password "$artifactory_password"
        }
    }
}

We are passing the -Dorg.gradle.daemon=false flag, so a lingering daemon should not be messing with anything, and I’ve verified that the values for artifactory_user and artifactory_password are correct.
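For reference, those are ordinary Gradle project properties. On a developer machine they would typically live in a gradle.properties file along the lines of the sketch below; in CI they could just as well be passed as -P flags or ORG_GRADLE_PROJECT_* environment variables (the values here are placeholders, not our real setup):

# placeholder values; our real setup may supply these differently
artifactory_contextUrl=https://artifactory.example.com/artifactory
artifactory_user=build
artifactory_password=not-the-real-password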

These three repositories are all virtual repositories that point to one or two concrete (local) repositories in our Artifactory. I’ve verified that the user we’re using has read access to the virtual repos as well as the local repos.

As I said, the Gradle build ends in a bunch of 403 errors (which artifacts get the 403 response differs from run to run), but here is the relevant log from Artifactory for a single build run that quickly resulted in a 403 error:

20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|11|REQUEST|10.3.104.55|build|GET|/libs-release/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|2|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/ourproduct-model-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|2|REQUEST|10.3.104.55|build|GET|/libs-release/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/ourproduct-base-exception-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|403|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|403|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|403|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|401|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-base-exception/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|403|0
20190926112214|1|REQUEST|10.3.104.55|build|GET|/libs-release/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/ourproduct-base-config-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|0|REQUEST|10.3.104.55|build|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/ourproduct-base-config-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|5|REQUEST|10.3.104.55|build|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|200|782
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-base-config/1.2-SNAPSHOT/ourproduct-base-config-1.2-20190926.102550-80.pom|HTTP/1.0|403|0
20190926112214|1|REQUEST|10.3.104.55|build|GET|/libs-release/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-release/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/ourproduct-drivers-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|0|REQUEST|10.3.104.55|build|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|404|0
20190926112214|0|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-staging/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/ourproduct-drivers-1.2-SNAPSHOT.pom|HTTP/1.0|403|0
20190926112214|4|REQUEST|10.3.104.55|build|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/maven-metadata.xml|HTTP/1.0|200|778
20190926112214|1|REQUEST|10.3.104.55|non_authenticated_user|GET|/libs-snapshot/com/mycompany/ourproduct/ourproduct-drivers/1.2-SNAPSHOT/ourproduct-drivers-1.2-20190926.102550-80.pom|HTTP/1.0|403|0

As you can see, some requests are done by the “build” user but others are reported as “non_authenticated_user”.

This makes no sense to me as I’m providing the proper credentials (otherwise I’d have no “build” user requests) and there is only a single Gradle process running.

Has anybody seen anything like this? Any ideas what might be going on here?

Could you be impacted by https://github.com/gradle/gradle/security/advisories/GHSA-4cwg-f7qc-6r95 ?

In short, if your virtual repos end up redirecting to a different host, then Gradle 5.6+ will no longer reuse the credentials.

That seems unlikely. We only have a single Artifactory server and the three virtual repositories I mention in the question are all composed of only local repositories on the same server.

A bit more information:

Every build I do, whether it succeeds or not, produces some “non_authenticated_user” requests interleaved with the properly authenticated ones.

Many of the requests are Gradle checking for an artifact in a repository that doesn’t hold it. That is, when Gradle looks for a snapshot version of an artifact, it starts by querying the release and staging repos because they are defined first. This results in a 404 if the request is authenticated, or a 401 if it is one of the weird “non_authenticated_user” requests. Either way, that’s fine, and Gradle happily continues down the list.
However, if a request for a snapshot version happens to go out as a non-authenticated request to the snapshot repository, it gets a 403 (Forbidden). That is the correct response, since an anonymous user is not permitted to fetch that artifact, but it fails the Gradle build.

I’m now trying to figure out a way to intercept all the HTTP requests Gradle makes so I can see what the Authorization headers are set to. This is not proving to be easy: I’m trying to use mitmproxy, but the SSL handshake fails even though I’ve added the mitmproxy certificate to the CA certs file of the JVM that is running Gradle. If I get that working, I should be able to verify whether Gradle is in fact sending out the wrong (or no) credentials with these requests, rather than Artifactory failing to authenticate correct credentials.
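For the record, importing the mitmproxy CA into the JVM trust store is roughly the following keytool invocation; treat it as a sketch, since the cacerts path and store password are the JDK defaults and differ between JDK versions and installs:

# paths and the "changeit" password are JDK defaults; adjust for your JDK and mitmproxy install
keytool -importcert -alias mitmproxy -file ~/.mitmproxy/mitmproxy-ca-cert.pem -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit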

I’ll report back if I figure that out… any pointers would be much appreciated.

… and just a bit more.

I just took one of the URLs that resulted in a 403 in my Gradle build and created a curl command that requests the same resource, authenticating with the same user/password. I ran that in a loop for a while and every single request was properly authenticated in the Artifactory log.
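The loop was essentially the following sketch (the host is a placeholder for our real Artifactory, the path is one of the 403 URLs from the log above, and the two credential variables have to be set beforehand):

# placeholder host and shell variables; prints one HTTP status code per request
for i in {1..100}; do curl -s -o /dev/null -w "%{http_code}\n" -u "$ARTIFACTORY_USER:$ARTIFACTORY_PASSWORD" "https://artifactory.example.com/artifactory/libs-snapshot/com/mycompany/ourproduct/ourproduct-model/1.2-SNAPSHOT/maven-metadata.xml"; done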

So it really looks like Gradle is messing up the credentials in some of these requests.

Not quite sure if I should start a new thread about this but I’ve boiled the problem down to an absolute minimal setup.

I ran the official Artifactory docker container locally:
docker run --rm --name artifactory -d -p 8081:8081 docker.bintray.io/jfrog/artifactory-oss:latest

Then I opened a web browser at http://localhost:8081 and logged in with admin:password (the default admin account in Artifactory).

Next I went through the “Welcome Wizard”: I skipped the proxy step, and in the second step I chose to create the default “Maven” repositories. This creates the following repositories:

  • libs-snapshot-local (local repository type)
  • libs-release-local (local repository type)
  • jcenter (distribution repository type)
  • libs-snapshot (virtual repository type)
  • libs-release (virtual repository type).

The two virtual repositories (libs-snapshot and libs-release) are configured to “point to” the respective local repositories as well as the jcenter distribution repository.

Next I ran mitmproxy in “reverse proxy” mode so I could intercept all requests and take a look at them:
sudo mitmproxy --mode reverse:http://localhost:8081

By default, mitmproxy uses port 8080 so now I have an “Artifactory server” running on port 8080 on my machine.

Next I created a dead-simple Gradle project that connects to this local Artifactory server via the mitmproxy (localhost:8080).
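The build file boiled down to something like the following sketch (reconstructed: the java plugin and the dependency are just illustrative, while the two repositories and the admin credentials are the ones from the setup above):

plugins {
    id 'java'
}

repositories {
    maven {
        url "http://localhost:8080/artifactory/libs-release"
        credentials {
            username "admin"
            password "password"
        }
    }
    maven {
        url "http://localhost:8080/artifactory/libs-snapshot"
        credentials {
            username "admin"
            password "password"
        }
    }
}

dependencies {
    // illustrative placeholder: any dependency that has to be resolved through
    // the local Artifactory triggers the requests shown in the proxy
    implementation 'com.example:some-library:1.0-SNAPSHOT'
}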

So now… if I run rm -rf ~/.gradle/caches/ && ./gradlew -Dorg.gradle.daemon=false clean assemble, I can see in the proxy all the requests that Gradle makes to my Artifactory server.

And lo and behold… the first few requests get a 401 response. Looking at the content of those requests, I can see that there is no Authorization header being sent with them!

After a while the responses change to 200, and looking at the corresponding requests I can see that the Authorization header is included.

So I’m a bit confused: should Gradle be sending any requests without auth headers to the Artifactory repositories at all? Both of them (libs-snapshot and libs-release) are defined in my Gradle project with a credentials block, so it really doesn’t make sense that any requests go out without the Authorization header, right?

In my experience, Artifactory can require preemptive authentication depending on how it is configured, while Gradle by default only sends credentials after receiving an authentication challenge. You can force Gradle to send the auth header with the initial request by adding the following block inside your maven { } declaration:

authentication {
  basic(BasicAuthentication)
}
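Applied to the repositories block from the original question, that would look roughly like this (same property names as above):

maven {
    url "${artifactory_contextUrl}/libs-release"
    credentials {
        username "$artifactory_user"
        password "$artifactory_password"
    }
    // send the Basic auth header preemptively instead of waiting for a challenge
    authentication {
        basic(BasicAuthentication)
    }
}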

Thanks! You’re absolutely right, adding that did result in Gradle sending the auth headers immediately.

I doubt this will fix our 403 problem (the one I described at the top) but at least I can put this in to make sure that’s not causing any problems.

Hi @StFS, did you ever manage to fix this issue in the end? It’s 2023 and we are now facing the exact same thing: users are migrating from Jenkins to GitLab and are hitting the same sporadic failures against our Artifactory instance.