Is there a way to cache downloaded dependencies (e.g. from slow/external artifactory servers) in the build cache server so that Gradle only has to fetch them once when things change?
Currently, dependencies are not downloaded through the build cache.
Gradle only downloads a dependency when it has not already been downloaded, or when it cannot be sure that the already downloaded copy matches what is in the repository (e.g. because the repository does not publish checksums). In most cases, Gradle already only downloads changed or new artifacts.
I suspect something else is going on in your environment. Can you please elaborate on the behaviour you are experiencing?
Thanks for your reply!
We are using a Jenkinsfile DSL pipeline with a clean workspace for each CI run (to verify that we can do a clean build). After cloning the repo, the first Jenkins stage invokes gradle assemble. Without an existing .gradle directory this takes forever; most of the time is spent downloading from our corporate Artifactory server.
We do not control the Artifactory server but are required to use it. As a workaround, we periodically run ./gradlew assemble with a fresh GRADLE_USER_HOME and then remove the script cache from it. Before each build, we rsync this template GRADLE_USER_HOME to the build machine and set the GRADLE_USER_HOME environment variable to point at it.
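For concreteness, the workflow we use looks roughly like the following sketch. All paths, the agent host name, and the script-cache location are placeholders specific to our setup, and the script cache directory may vary between Gradle versions:

```shell
# Sketch of our current workaround (hypothetical paths and host names).

# Step 1: periodically refresh a template dependency cache on a warm-up machine.
refresh_template() {
  export GRADLE_USER_HOME=/var/cache/gradle-template
  ./gradlew --no-daemon assemble                # populates the dependency cache
  rm -rf "$GRADLE_USER_HOME"/caches/*/scripts   # drop the script cache (location may vary)
}

# Step 2: before each CI build, copy the template cache to the build agent.
seed_agent() {
  rsync -a /var/cache/gradle-template/ ci-agent:/var/cache/gradle-home/
}

# Step 3: in the Jenkins stage, point Gradle at the seeded cache.
run_build() {
  export GRADLE_USER_HOME=/var/cache/gradle-home
  ./gradlew --no-daemon assemble                # dependencies are already present
}
```

This keeps the per-build download time low, but it requires maintaining the template and the rsync step on every agent.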
Now that I have set up a build cache server, I am hoping there is a more elegant solution than the above: specifically, skipping the dependency downloads without pre-creating a GRADLE_USER_HOME and rsync-ing it to the build machines.
There’s nothing built in here that will help; the build cache stores task outputs, not downloaded dependencies.
Faced with this problem, I would explore putting some kind of caching proxy in front of your Artifactory server, deployed close to where your builds run so that transfers are fast. You could use something like Nginx for this.
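As a rough illustration, a minimal Nginx caching proxy might look like the following. The host name, port, cache path, and sizes are all placeholders, and you would point your Gradle repository URLs at the proxy instead of at Artifactory directly:

```nginx
# Hypothetical reverse-proxy cache in front of Artifactory.
proxy_cache_path /var/cache/nginx/artifactory levels=1:2
                 keys_zone=artifactory:10m max_size=20g inactive=30d;

server {
    listen 8081;

    location / {
        proxy_pass https://artifactory.example.com;
        proxy_cache artifactory;
        proxy_cache_valid 200 30d;                     # release artifacts are immutable
        proxy_cache_use_stale error timeout;           # serve stale copies if upstream is down
        proxy_ignore_headers Cache-Control Expires;    # cache even if upstream says not to
    }
}
```

With this in place, each artifact is fetched from the slow Artifactory server once and then served from the local cache, without any per-build GRADLE_USER_HOME preparation.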