Dependency resolution always looks in the external Maven repo

I have noticed that every time I run a task that requires dependency resolution, Gradle always looks in the externally configured Maven repo, even though no dependencies have changed and, as far as I can see, I have no snapshot dependencies (in case it is like Maven and checks for snapshots all the time). This behaviour becomes even worse when I move from the office to home and the company repos are not accessible: resolution takes a long time, as I guess it keeps trying the unreachable repos?

The reason for contacting the Maven repo is to ensure that the artifact is still available in the defined repo. Otherwise the build may be successful on one machine where the artifact was cached before, and fail on another machine where the artifact wasn’t cached before. You can always run your build with the --offline switch to avoid any repository communication.

cheers, René

To me this seems like the wrong default behaviour. I can understand wanting to stop the “it works on my machine” problem, but if the original repository that dependencies were downloaded from disappeared, this would be obvious to a person trying to build on another machine with a fresh build, as Gradle would complain that dependencies could not be found. What is there to gain from doing this check every time if dependencies have not changed? It seems like it should be optional behaviour that you could trigger from the command line, so a developer (or CI server) can periodically check whether all dependencies are available from the repos configured in their build.

Which Gradle version are you using?

The latest and greatest 1.0

Gradle shouldn’t (and normally won’t) look up dependencies again, unless you use changing modules (e.g. Maven snapshots), dynamic versions (e.g. version ranges), or --refresh-dependencies. In the first two cases, you can configure how long the previous result is cached (the default is 24 hours). So I’m not sure what’s going on. Does ‘--offline’ solve the problem?

Yes, ‘--offline’ works fine.

Can you post the debug output (’-d’) as a Gist? Are you sure the build isn’t using dynamic modules or version ranges?

When you say “external maven repository”, how is this declared? A known limitation of Gradle is that resolution from ‘filesystem’ repositories (with a url like “file://”) bypasses the Gradle cache. Could that explain the behaviour you are seeing?

Without more context (repositories block, log output) it’s hard to provide more information.

I think I have found the offending dependency, which has introduced version ranges into its POM. [Meet the offender: Spring Data JPA](http://repo1.maven.org/maven2/org/springframework/data/spring-data-jpa/1.1.0.RELEASE/spring-data-jpa-1.1.0.RELEASE.pom). The POM contains the range below:

`[${spring.version.30}, ${spring.version.40})`

Now the question is: how do I neutralise the offender, so Gradle does not keep looking?
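For reference, the kind of thing I was hoping for is to force the transitive versions myself. A sketch only: the module coordinates and the 3.1.2.RELEASE version below are examples, not necessarily what my build uses.

```groovy
// Sketch: override the [${spring.version.30}, ${spring.version.40}) range
// introduced by Spring Data JPA by forcing fixed Spring versions.
// Coordinates and versions here are assumptions; adjust to your build.
allprojects {
    configurations.all {
        resolutionStrategy {
            force 'org.springframework:spring-core:3.1.2.RELEASE',
                  'org.springframework:spring-context:3.1.2.RELEASE'
        }
    }
}
```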

I am also very confused about how all the dependency caching works. I have the below config for the repos:

```groovy
allprojects {
    repositories {
        maven { url "http://atlas.intranet.co.uk:8021/artifactory/remote-repos" }
        maven { url "http://128.30.240.235:8081/nexus/content/groups/public" }
    }
}
```

When I run ‘gradle dependencies -d’, all I see is mention of trying to connect to the second repository and no mention of the atlas one. I have changed the order and it makes no difference.

Even stranger: when working away from the office and unable to access the repos, if I comment out the second repo (from which I download 3 of my 20-odd dependencies), Gradle complains it cannot download them, and this occurs even if I run with --offline. When I uncomment the second repo and run again with --offline, it works fine?

Sorry, I can’t post on Gist due to a company policy about code in the public domain, but maybe I can email stuff.

Any ideas why Gradle wants to check for new dependencies with dynamic versions? I would have thought that with dynamic ranges, if the dependency was already satisfied, there would be no need to check again. In the above example, Spring Data JPA is saying you can use any version of Spring 3.x, and I have my Spring jars all set to version 3.1.2, so why does Gradle insist on checking remote repositories every build?

By default, Gradle will check for updates to dynamic versions once every 24 hours. You can configure this timeout: http://gradle.org/docs/current/userguide/userguide_single.html#sec:controlling_caching
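As a sketch of what that configuration looks like (the periods chosen here are arbitrary examples):

```groovy
// Sketch: control how long Gradle trusts a cached resolution result
// before querying the repository again. Values are examples only.
configurations.all {
    resolutionStrategy {
        cacheDynamicVersionsFor 7, 'days'     // dynamic versions / version ranges
        cacheChangingModulesFor 24, 'hours'   // changing modules (e.g. snapshots)
    }
}
```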

Are you sure that Gradle is checking every build?

How resolution and caching works is outlined in the User Guide. Please review that and let us know if something still isn’t clear.

The important point is that Gradle caches artifacts per repository. So when you comment out the repository, the artifacts are not available, even when running with ‘--offline’.

Can you please explain the rationale for having a per-repository cache? I would have thought that once the dependency was cached locally, it was cached. Why does Gradle care where it originally came from?

Why does it check for updates when the dependency has been satisfied by dependencies declared in the build.gradle file? Take the Spring Data example above: all the POM is saying is that Spring Data JPA will work with any Spring version from 3.0.7 upwards, but not 4.x. I have all the Spring dependencies in my build.gradle set to version 3.1.2, so why does Gradle need to go to an external repo when the dependency has been satisfied? I could understand it if I had declared a dynamic range for my Spring jars in my build.gradle, but I have not.

Gradle’s dependency conflict resolution will always attempt to use the newest version that matches a version range. This behaviour is different to Maven, which uses a slightly different algorithm for conflict resolution.

You have a version range (dynamic version) in your dependency graph (introduced by Spring Data JPA). To find out the newest version, Gradle must query the repository. To make this more efficient, by default Gradle will cache the result for 24 hours.
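To illustrate (with assumed coordinates and versions), a fixed declaration does not stop the range from being resolved:

```groovy
dependencies {
    // Declared directly: a fixed version.
    compile 'org.springframework:spring-core:3.1.2.RELEASE'
    // Introduces a transitive version range on spring-core,
    // e.g. [3.0.7, 4.0), via its POM.
    compile 'org.springframework.data:spring-data-jpa:1.1.0.RELEASE'
}
// Gradle resolves the range against the repository to find its newest
// candidate, then conflict resolution picks the newest version overall.
// So if, say, a 3.1.3.RELEASE were published, it would win over the
// fixed 3.1.2.RELEASE declaration.
```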

It’s not uncommon for different repositories to have different artifact files for the same coordinates ‘foo:bar:1.0’ over time. These artifacts could be compiled on different JVMs or with different optional packages included.

This scenario can lead to problems that are very hard to track down, when the ‘incorrect’ artifact remains in the cache from a completely unrelated build file.

Gradle is able to cache multiple different variations of any artifact: these are differentiated based on the SHA1 checksum of the artifact. So the first time you try to access an artifact from a repository, Gradle will first ask the repository for the SHA1 checksum of the artifact in that repository. If the SHA1 checksum matches a cached artifact, it will not be downloaded again.

So Gradle does need to access the remote repository in order to determine the dependencies that will be served by that repository. This is done in an efficient manner, and the result will be cached indefinitely (except for changing modules).

I fully understand this for the case where a dynamic version range in the dependency graph is not satisfied by a static version of that dependency elsewhere in the graph. But where the dependency is satisfied by a statically declared dependency, I do not see what is gained by looking in the repository again, as nothing in the DAG will change until I change the version of the statically declared dependency. Even then, the updated version may still be within the dynamic range, so again, what is the need to check the repository when this will not change the DAG?