Gradle multiproject build

So I have a project structure as follows:

|-- module-1
|-- module-2
|-- module-n

The final artifact we want is a single jar combining the sources of all the modules. I am not sure whether this is the right way to do it.
Here is my build script:

plugins {
	id "com.github.johnrengelman.shadow" version "1.2.3"
}

apply plugin: 'java'

group = 'com.root'
version = System.getenv('BUILD_NUMBER') ?: '1.0-SNAPSHOT'
sourceCompatibility = 1.8

configurations { providedCompile }

repositories {
}

def sparkVersion = '2.0.0'
def scalaMajorVersion = '2.11'

sourceSets {
	main {
		java {
			srcDirs = ['module1/src/main/java', 'module2/src/main/java']
		}
	}
}

dependencies {
	compile group: 'org.scala-lang', name: 'scala-compiler', version: '2.11.8'
	compile group: 'org.apache.kafka', name: 'kafka-clients', version: ''
	compile group: 'org.apache.spark', name: "spark-streaming-kafka_$scalaMajorVersion", version: '1.6.2'
	compile group: 'org.apache.spark', name: "spark-streaming_$scalaMajorVersion", version: "$sparkVersion"
	compile group: 'org.apache.spark', name: "spark-catalyst_$scalaMajorVersion", version: "$sparkVersion"
	compile group: 'com.google.code.gson', name: 'gson', version: '2.7'
	compile group: 'org.apache.zookeeper', name: 'zookeeper', version: '3.4.9'
	providedCompile group: 'org.apache.spark', name: "spark-core_$scalaMajorVersion", version: "$sparkVersion"
	providedCompile group: 'org.apache.spark', name: "spark-sql_$scalaMajorVersion", version: "$sparkVersion"
	testCompile group: 'junit', name: 'junit', version: '4.11'
}

shadowJar {
	zip64 true
}

sourceSets.main.compileClasspath += configurations.providedCompile
sourceSets.test.compileClasspath += configurations.providedCompile
sourceSets.test.runtimeClasspath += configurations.providedCompile

artifacts {
	archives file: shadowJar.archivePath, builtBy: shadowJar
}

So we are just appending to sourceSets whenever a new module is added.

I just wanted to confirm whether this is the right way to build a multi-module project. If not, what is the right way, and how do I account for inter-module dependencies?

You might want to read the docs on multi-project builds, which cover this topic pretty well:

There are also plenty of examples in the samples/userguide/multiproject/ folder of the ‘-all’ distribution of Gradle.

If you structure things correctly there is no need to manually update the sourceSets.

Generally for a multi-project build the structure is like this:

MyProject\
├───build.gradle
├───settings.gradle
├───module1\
│       └───build.gradle
├───module2\
│       └───build.gradle
├───module3\
│    ├───module3a\
│    │       └───build.gradle
│    ├───module3b\
│    │       └───build.gradle
│    └───module3c\
│            └───build.gradle
├───module4\
│       └───build.gradle
└───module5\
        └───build.gradle


At the top level is the root project, which has its own build.gradle; this is where you would typically define your “fat jar” task to include all the sub-projects. You can also put tasks that are common to the sub-projects here.

There is one (and only one) settings.gradle file. This is where you include the sub-projects that should be part of the multi-project build, e.g.:

rootProject.name = 'MyMultiProject'
include 'module1', 'module2'

Each sub-project would then have its own build.gradle to define sub-project specific tasks.
If you’ve applied the required plugins and defined all the necessary tasks in the root project (e.g. in subprojects {…} or allprojects {…} blocks), then each sub-project doesn’t necessarily need its own build.gradle file.
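For example, a root build.gradle that shares configuration across the sub-projects might look like this (a minimal sketch; the repository and the JUnit dependency are illustrative assumptions, not taken from the build above):

// Applies to every project, including the root
allprojects {
	group = 'com.root'
	version = '1.0-SNAPSHOT'
}

// Applies only to the child projects
subprojects {
	apply plugin: 'java'
	sourceCompatibility = 1.8

	repositories {
		mavenCentral()
	}

	dependencies {
		testCompile group: 'junit', name: 'junit', version: '4.11'
	}
}

Anything a particular module needs beyond this goes into that module’s own build.gradle.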

Dependencies would be handled in the sub-projects’ dependencies block.

dependencies {
    compile project(':moduleY'), project(':moduleX:modZ')
    testCompile 'org.testng:testng:6.9.4'
}

I have not used the shadowJar plugin so I can’t comment on how to configure it to include all your sub-projects. From what I’ve read I know it is possible though.

Setting the shadow jar aside, I followed what you have mentioned above. One separate jar is created for each module, containing the compiled output of that module, and a root jar is also created which doesn’t contain anything apart from the manifest.

What I need to achieve is that the root jar contains the compiled output of all the modules. How can I do that?

I am now able to include the compiled output of all the modules in the root jar.

I used the following approach:

shadowJar {
	zip64 true
	subprojects.each { subproject ->
		from subproject.sourceSets.main.output.classesDir
		from subproject.sourceSets.main.output.resourcesDir
	}
	// bundle the compile dependencies of every module
	configurations = subprojects.collect { it.configurations.compile }
}
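One caveat worth noting (my assumption, not something verified in this thread): since shadowJar pulls class files straight out of the sub-projects’ output directories, it should also depend on their compile tasks so everything is built before the jar is assembled:

shadowJar {
	// make sure every module is compiled before its output is bundled
	dependsOn subprojects.collect { "${it.path}:classes" }
}

dependsOn accepts task paths as strings, so ':module1:classes' etc. will be resolved against the sub-projects.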

Now I would just like to know the advantage of the approach you have mentioned over the one I gave in my question (i.e. appending to sourceSets when a new module is added)?

You’ll find many of the advantages in the user guide :slight_smile:
Chapter 8. Introduction to multi-project builds
Chapter 25. Multi-project Builds

The O’Reilly e-book Building and Testing with Gradle also has useful information.
There are plenty of other online resources that can offer more info.

To get you started:

Off the top of my head here are some other items to consider:

  • One large advantage is that it is easier to write and maintain a multi-project build than one large monolithic build.
    To quote the User Guide

It’s often much easier to digest and understand a project that has been split into
smaller, inter-dependent modules

  • It enables sharing common configuration between the child projects, for example by applying the same sets of plugins and dependencies while also allowing child projects to override or extend this if needed.

  • Cleaner (easier to understand) and finer grained dependency management for modules.

  • Better extensibility, re-usability, etc.

  • Incremental builds: you can build only the modules you care about, and you don’t have to worry about inter-project dependencies because Gradle handles that for you.

  • You can run tests and execute tasks within specific subprojects without building the entire world.

  • A well written multi-project build is also faster, as Gradle can run tasks in parallel on multi-core systems.
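On that last point: assuming a version of Gradle that supports it, parallel project execution can be switched on in gradle.properties (it is off by default):

org.gradle.parallel=true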

Of course, all of the above depends on the project you’re working with, the complexity of the code base, deployment considerations, external libraries, team size, etc. Other people may disagree with some of the above points for these or other reasons. Ultimately you will need to decide for yourself, based on your requirements and constraints, what type of build best suits your needs.