Gradle-worker.jar process using too much memory

I have supplied -Xmx1024 in GRADLE_OPTS and in gradle.properties, but a java -cp gradle-worker.jar process is still taking over 4 GB of memory. How do I configure this process to use a fixed amount of memory? The CI platform we're using only allows a combined 4 GB across all of the containers.

Here is the memory usage report:

PID  RSS     %CPU COMMAND

5433 4425544 96.5 /usr/lib/jvm/jdk1.7.0/bin/java -javaagent:/home/ubuntu/.gradle/caches/modules-2/files-2.1/com.newrelic.agent.android/class-rewriter/3.407.0/6f7f95a36369f76b739461331202406fa1f519c7/class-rewriter-3.407.0.jar -Dandroid.assets=/home/ubuntu/Android/app/build/intermediates/assets/debug -Dandroid.manifest=/home/ubuntu/Android/app/build/intermediates/manifests/debug/AndroidManifest.xml -Dandroid.resources=/home/ubuntu/Android/app/build/intermediates/res/debug -Djava.security.manager=jarjar.org.gradle.process.internal.child.BootstrapSecurityManager -Dfile.encoding=UTF-8 -ea -cp /home/ubuntu/.gradle/caches/1.10/workerMain/gradle-worker.jar jarjar.org.gradle.process.internal.launcher.GradleWorkerMain

5045 415676 117 /usr/lib/jvm/jdk1.7.0/bin/java -XX:MaxPermSize=512m -Xmx512m -Dfile.encoding=UTF-8 -cp /usr/local/gradle-1.10/lib/gradle-launcher-1.10.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 1.10 /home/ubuntu/.gradle/daemon 120000 51920caf-a858-47db-97b9-036de3e1de7c -XX:MaxPermSize=512m -Xmx512m -Dfile.encoding=UTF-8

4780 326000 7.0 /usr/lib/jvm/jdk1.7.0/bin/java -Xmx1024m -XX:MaxPermSize=1024m -Dorg.gradle.appname=gradle -classpath /usr/local/gradle-1.10/lib/gradle-launcher-1.10.jar org.gradle.launcher.GradleMain --info --project-dir /home/ubuntu/Android/app test

GRADLE_OPTS controls the Gradle process itself. Processes that Gradle launches are controlled by the task that launches them.

So, what is process 5433? Is it running your tests?

Hey, it turned out that the JVM was allocating far too much memory because maxHeapSize wasn't specified in the test DSL. Setting maxHeapSize in the (Robolectric) test block resolved the issue.
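
For anyone hitting the same thing, a minimal sketch of that fix in build.gradle might look like the following (the specific values are examples, not taken from the original project):

```groovy
// build.gradle — cap the heap of the forked test worker process
// (the gradle-worker.jar JVM), which GRADLE_OPTS does not control.
test {
    maxHeapSize = "1g"               // example value; tune to your CI memory budget
    // On JDK 7 (as used in this build) PermGen can be capped separately:
    jvmArgs "-XX:MaxPermSize=512m"   // example value
}
```

The key point is that `maxHeapSize` on the Test task configures the forked worker JVM, whereas GRADLE_OPTS and gradle.properties only affect the Gradle launcher/daemon processes.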

Thanks!