Inferring Scala version is slow

(Ben Manes) #1

On a multi-module project, the new Scala version inference is slow because the dependencies must be resolved before every task. Normally, running `gradle tasks` takes 4s with the daemon and 8s without. With inference it takes 17s and 26s, respectively. Since inference is now the preferred approach, this seems like a bug, though there is an easy workaround: keep configuring scalaTools explicitly.
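For context, the workaround amounts to pinning the Scala tooling in each module so the plugin never has to resolve compile dependencies to infer a version. A minimal sketch (the 2.9.2 version here is illustrative, not taken from the thread):

```groovy
// build.gradle -- explicitly configure scalaTools so no inference is needed
apply plugin: 'scala'

repositories {
    mavenCentral()
}

dependencies {
    // Pinning the compiler/library here sidesteps version inference.
    scalaTools 'org.scala-lang:scala-compiler:2.9.2'
    scalaTools 'org.scala-lang:scala-library:2.9.2'

    compile 'org.scala-lang:scala-library:2.9.2'
}
```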

(Peter Niederwieser) #2

Thanks for the feedback. I wouldn’t expect inference to cause a noticeable slowdown, and I can’t reproduce it either (even for a multi-project build). Can you try to run the build several times in a row, both with and without configuring ‘scalaTools’? Also, can you try to give the Gradle JVM more memory? Are you doing anything special that could cause this?

(Ben Manes) #3

Increasing the memory setting had no effect. The wrapper is generated with customizations to DEFAULT_JVM_OPTS. Multiple executions show the same behavior.

If you open a JIRA ticket, I can attach our build scripts. We externalize confidential configurations.

The likely culprit is our build structure, which promotes modules. A top-level gradle directory contains templates that modules mix in for behavior (base, java, scala, application, etc.), primarily as common customizations to plugins. Each module applies the templates that suit it; for example, a server type may be a Java application that depends on library and service modules. This allows a single repository to host multiple server types and reuse code through build dependencies instead of publishing to an artifact repository. It probably means that every Scala module needs to be evaluated independently to dynamically determine which Scala version it is using.
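Roughly, the layout looks like this (module and template names here are illustrative, not our actual ones):

```groovy
// settings.gradle -- a hypothetical multi-module layout
include 'core-library', 'billing-service', 'web-server'

// gradle/scala.gradle -- a template that Scala modules mix in
apply plugin: 'scala'
dependencies {
    compile 'org.scala-lang:scala-library:2.9.2'  // illustrative version
}

// web-server/build.gradle -- a server module composing templates
apply from: "${rootDir}/gradle/java.gradle"
apply from: "${rootDir}/gradle/application.gradle"
dependencies {
    compile project(':core-library')
    compile project(':billing-service')
}
```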

(Peter Niederwieser) #4

I’m afraid I lack the context to fully understand what you are saying. Anyway, you can try this:

  • Add configurations scalaTools1, scalaTools2, scalaTools3, scalaTools4 to each project.
  • Add the same scala-compiler (repository) dependency to each configuration.
  • Resolve all of these configurations.
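The steps above might be sketched as follows in a build script (the scala-compiler version is illustrative):

```groovy
// build.gradle -- time the resolution of four scalaTools-like
// configurations in every project to estimate inference overhead
allprojects {
    repositories {
        mavenCentral()
    }

    configurations {
        scalaTools1
        scalaTools2
        scalaTools3
        scalaTools4
    }

    dependencies {
        ['scalaTools1', 'scalaTools2', 'scalaTools3', 'scalaTools4'].each {
            add(it, 'org.scala-lang:scala-compiler:2.9.2')  // illustrative version
        }
    }

    task resolveScalaTools << {
        def start = System.currentTimeMillis()
        ['scalaTools1', 'scalaTools2', 'scalaTools3', 'scalaTools4'].each {
            configurations[it].resolve()
        }
        println "${project.name}: resolved in ${System.currentTimeMillis() - start} ms"
    }
}
```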

How long does this take? That’s roughly the kind of overhead I expect inference to have. Since the scala-compiler dependency is already in the dependency cache, the slowdown shouldn’t be significant.