Managing artifact cache growth

What is the recommended way to manage the size of Gradle’s artifact cache? One of our build servers ran out of space recently as the artifact cache exceeded 150GB.

For example, is there some setting in Gradle that will cause it to trim least-recently-used artifacts?

Or does someone have a script they typically use for this?

For now, I think all you can do is throw the cache away manually. On your build server, you could add a cron job for this.

We found that the vast majority of disk usage is in the “temp” directory: [GRADLE_HOME]/caches/artifacts-[N]/filestore/temp. It seems that every run of Gradle (every execution of a Jenkins build job, in our case) contributes JARs to this directory, though I haven’t tried to prove it.

We are working on a cron job such as Rolf suggests.

Dan, that would be odd if temp is growing; we do try to clean it up. Any chance you could confirm that this is happening? If so, could you also provide environment info such as OS, Java version, etc., please?

When I get a chance, I’ll try to confirm if it happens every single time, or if there is some other factor involved.

I can say for sure that on April 25, our /temp directory was at about 16 GB. I cleared it out, and then we got back up to about 16 GB again on May 8. The rest of the filestore directories consume only about 33 MB altogether.

Gradle 1.0-milestone-8a
Gradle build time: Monday, February 20, 2012 4:00:18 PM UTC
Groovy: 1.8.4
Ant: Apache Ant(TM) version 1.8.2 compiled on December 20 2010
Ivy: 2.2.0
JVM: 1.7.0_01 (Oracle Corporation 21.1-b02)
OS: Linux 2.6.18-238.9.1.el5 amd64

I looked into this. It looks like a build job writes to the /temp directory only if we have passed --refresh-dependencies (we’re using 1.0-m8a). We pass this switch on many build jobs because otherwise we periodically encounter builds that fail due to stale dependencies. So it’s a workaround. But is it intended that the /temp directory should not be cleaned when dependencies are refreshed?

We pass this switch on many build jobs because otherwise we periodically encounter builds that fail due to stale dependencies.

An alternative is to use “Fine-tuned control over dependency caching”, as described in the Gradle user guide.
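As a sketch of what that approach looks like in a Groovy build script (the 10-minute timeouts are illustrative, not a recommendation from this thread, and this assumes a Gradle version recent enough to support the resolutionStrategy DSL):

```groovy
// Shorten the time Gradle caches resolved dynamic versions and changing
// modules, instead of passing --refresh-dependencies on every build.
configurations.all {
    resolutionStrategy {
        // Re-check dynamic versions (e.g. "1.+") every 10 minutes
        // instead of the 24-hour default.
        cacheDynamicVersionsFor 10, 'minutes'
        // Re-check changing modules (e.g. -SNAPSHOT) every 10 minutes.
        cacheChangingModulesFor 10, 'minutes'
    }
}
```

This only forces re-resolution of the versions that can actually change, rather than re-downloading every third-party dependency the way --refresh-dependencies does.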

Interesting. I guess our issues stem from “Gradle caches dynamic versions for 24 hours”. We are using dynamic versions to specify dependencies within our projects. My hesitation is that I don’t think we encountered stale dependencies as frequently as I would expect with a 24-hour cache. And I don’t think we ever see stale dependencies for artifacts published to an Ivy filesystem repository on local developer machines.

In any event, --refresh-dependencies is definitely a big hammer, since it affects all the third-party dependencies, which are not dynamic in our build scripts.

Garbage collection for the cache is definitely on our list. As Rolf said above, right now you have to do it manually.

We just ran across this. Our CI server kept blowing up from lack of disk space, as our ~/.gradle/caches/artifacts-14/filestore/temp dir grew without end. To ensure “clean” builds, our admins decided --refresh-dependencies was a good idea, and that always adds stuff to the temp dir.

By default, it looks like Gradle never cleans out temp after downloading artifacts, so Gradle always takes at least 2x as much space on the file system as it should.

Our solution will be to implement a cron job to remove anything older than a day or so, but it’d sure be nice if Gradle did this for us.
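A minimal sketch of such a cron job, using the ~/.gradle/caches/artifacts-14/filestore/temp path mentioned above (the artifacts version number and the one-day retention window are assumptions; adjust both for your environment):

```shell
#!/bin/sh
# Hypothetical daily cleanup for the Gradle artifact cache's temp directory.
# Adjust the artifacts-[N] segment to match your Gradle version's layout.
CACHE_TEMP="$HOME/.gradle/caches/artifacts-14/filestore/temp"

# Delete files not modified within the last day, then prune any
# directories left empty by the first pass.
find "$CACHE_TEMP" -type f -mtime +1 -delete
find "$CACHE_TEMP" -mindepth 1 -type d -empty -delete
```

Installed via crontab (e.g. a nightly entry such as `0 3 * * * /path/to/clean-gradle-temp.sh`), this keeps the temp directory bounded between builds.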


In the newly published 1.3-rc-1 release notes, I noticed the filed and fixed GRADLE-2547. Is the behavior referred to in that JIRA issue the same as (or related to) the ‘temp’ subdirectory behavior discussed in this forum topic? Thanks.