Generated Eclipse .classpath files are not "stable" if dependencies comprise changing modules

I need to redistribute Gradle-generated Eclipse .classpath files for convenient use by other developers in our department. Now I have run into a problem with changing modules, i.e. modules that keep their name but undergo ongoing changes. Unfortunately, the generated .classpath refers directly to the local Gradle cache, which uses SHA-like keys to distinguish different versions of the same artifact (name). Thus the generated .classpath files are always outdated once one of the changing modules among the dependencies changes.

Is there a way to introduce some kind of indirection, e.g. mirror the cache to a stable directory structure and let the .classpath entries point to that structure?

Do you mind explaining why you need to check in a .classpath file?

Typically, developers using Gradle don’t do this. Instead, they run ‘gradle eclipse’ to generate the .classpath for the session.

We need to support a bunch of developers who are neither used to Gradle nor keen to deal with build tools at all. Thus running some kind of ‘gradle resolveAllDependencies’ from time to time is all that would be acceptable. On the other hand, we are running a continuous build server that might rebuild the changing modules every 120 seconds without effectively changing the code, e.g. due to triggers from parent projects. Thus even during a session the generated Eclipse .classpath may break.

Maybe a solution to our problem is to use a Sync task to copy dependent jars to a local M2 repository and customize the Eclipse .classpath generation to point to this repo. Would that be feasible, or bad practice?
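A minimal sketch of what I have in mind, assuming the ‘eclipse’ and ‘java’ plugins are applied (the directory name ‘stableLibs’ and the task name are placeholders, not established conventions):

```groovy
// build.gradle -- copy resolved dependencies into a stable folder and
// rewrite the generated .classpath entries to point there instead of
// into the SHA-named Gradle cache directories.
task syncLibs(type: Sync) {
    from configurations.runtime
    into "$projectDir/stableLibs"
}

eclipse {
    classpath {
        file {
            whenMerged { classpath ->
                // Redirect each library entry to the flat copy made by syncLibs.
                classpath.entries.findAll { it.kind == 'lib' }.each { entry ->
                    def jarName = new File(entry.path).name
                    entry.path = "stableLibs/$jarName"
                }
            }
        }
    }
}

// Make sure the jars exist before the .classpath is written.
tasks.eclipseClasspath.dependsOn syncLibs
```

The entry paths then stay identical from one refresh to the next as long as the jar names do not change.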

I don’t quite see how a local repo is going to help you, particularly if you are trying to integrate upstream changes.

What most people do in this situation is include a ‘refreshDependencies.bat’ (or ‘.sh’ if you’re not on Windows) that simply calls ‘gradlew eclipse’. Then, all your developers need to do is double click this file and then do a refresh in Eclipse.
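Such a wrapper script can be as small as this (file name and the trailing ‘pause’ are just suggestions):

```
@echo off
rem refreshDependencies.bat -- regenerate the Eclipse metadata via the wrapper
call gradlew.bat eclipse
pause
```

After double-clicking it, a developer only needs to hit F5 (Refresh) on the project in Eclipse.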

You are right about the client-side integration of upstream changes. However, as I tried to explain, effective or API changes are quite rare compared to the rebuild frequency. Since the server-side continuous build will handle dependencies appropriately, broken builds on the server side can serve as a hint to those developers who rarely refresh their dependencies. I’m not happy about this, but I know that the additional refresh in Eclipse won’t be accepted.

That said, the local repo solely serves as a means to get rid of the ‘SHA-named’ directories in the .classpath entries.

Why does the name matter?

Even if you use a different directory you still need to populate it somehow, which means running Gradle or some other process. I can’t quite see what the difference would be.

One more idea: If the local repo is maintained on a shared file store and frequently updated by the continuous build even client-side integration of upstream changes might work (better). However, the additional refresh in Eclipse will be necessary too.

Have you looked at the Gradle plugin for Eclipse?

http://marketplace.eclipse.org/content/gradle-integration-eclipse#.UWLCdatARd4

This removes the need for a .classpath file at all, as Eclipse will source the info directly from Gradle.

I had a look at the plugin. I guess you mean http://static.springsource.org/sts/docs/latest/reference/html/gradle/.

Unfortunately this plugin has certain issues that cause problems, e.g.

1.) it seems that an explicitly defined rootProject.name by means of settings.gradle is not considered:

Invalid project description. Invalid project description. OK C:\DEV\workspaces\TRUNK\XXXX overlaps the location of another project: ‘XXXX’

2.) import fails on a conflicting slf4j version (we already use 1.7.5) if DSL support is enabled (the default)
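For reference, point 1 concerns a project name set explicitly along these lines (‘XXXX’ stands in for our real project name):

```groovy
// settings.gradle -- names the root project explicitly; the STS Gradle
// import appears to ignore this value and derives the name from the
// directory instead, causing the overlap error above.
rootProject.name = 'XXXX'
```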

The name matters because it changes too often in our case. Sometimes we get a new build every 120 seconds, and the only thing that has changed is a date or build tag inside a manifest file.

Thus in most cases it would be sufficient to pick up the real changed JARs only once a week.
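Gradle’s resolution strategy can express that cadence directly; a sketch (the one-week interval is illustrative):

```groovy
// build.gradle -- only re-check changing modules (e.g. snapshot-style
// dependencies declared with 'changing: true') once a week instead of
// on every resolution, keeping the cached artifacts stable in between.
configurations.all {
    resolutionStrategy.cacheChangingModulesFor 7, 'days'
}
```

This does not change the SHA-named cache layout, but it does reduce how often the resolved jars (and hence the generated .classpath) actually move.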

The other problem is that we are dealing with multiple multi-module projects, i.e. multi-module projects that depend on other multi-module projects. Since project dependencies can only be specified inside each multi-project build, we customized the generated .classpath in order to refer to sibling multi-module projects. Unfortunately we can’t distribute these customized .classpath files, since the Gradle cache internals are not transparent, so our files are continuously outdated. I understand that these internals are necessary and useful to run the build; however, I thought that changing modules were meant to keep configurations such as the .classpath stable.
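The sibling-project customization could also be done inside the .classpath generation itself rather than by hand, roughly like this (the ‘siblingProjects’ map, its artifact prefixes, and the workspace project names are all assumptions; the ProjectDependency constructor signature may differ across Gradle versions):

```groovy
// build.gradle -- replace jar entries that come from sibling multi-module
// builds with Eclipse project references, so the .classpath never points
// into the cache for those artifacts.
def siblingProjects = ['sibling-core': 'sibling-core', 'sibling-util': 'sibling-util']

eclipse.classpath.file.whenMerged { classpath ->
    // Find library entries whose jar name matches a known sibling artifact.
    def replaced = classpath.entries.findAll { entry ->
        entry.kind == 'lib' &&
            siblingProjects.keySet().any { new File(entry.path).name.startsWith(it) }
    }
    classpath.entries.removeAll(replaced)
    // Add an Eclipse workspace project reference for each replaced jar.
    replaced.each { entry ->
        def name = siblingProjects.find { new File(entry.path).name.startsWith(it.key) }.value
        classpath.entries << new org.gradle.plugins.ide.eclipse.model.ProjectDependency("/$name")
    }
}
```

With that, the generated file contains only stable ‘src’ entries and project references for the sibling builds, and the cache paths never leak into the distributed .classpath.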