How do you force Gradle to include an artifact in a fat/uber jar?

I’m trying to build a fat jar. No matter what I do, one particular dependency is never included. The dependency is listed in my dependencies { […] } block,

e.g.

dependencies {
    // Explicitly added in hopes of being included in the fat jar
    compile group: 'org.apache.hadoop', name: 'hadoop-common', version: '2.7.0-me-1803
 
    // dependencies listed in the pom.xml
    compile group: 'com.me.ojai', name: 'me-ojai-driver', version: '6.0.1-me'
    compile group: 'com.fasterxml.jackson.core', name: 'jackson-databind', version: '2.6.5'
    compile group: 'org.apache.kafka', name: 'connect-json', version: '1.0.1-me-1803-streams-6.0.1'
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.2.1'
    compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.2.1-me-1803'
    compile group: 'org.apache.spark', name: 'spark-streaming-kafka-0-10_2.11', version: '2.2.1-me-1803'
    compile group: 'org.apache.kafka', name: 'kafka-clients', version: '1.0.1-me-1803-streams-6.0.1'

    // dependencies not in the pom.xml, but added to make the gradle build work
    compile group: 'xerces', name: 'xercesImpl', version: '2.11.0'
    compile group: 'org.json', name: 'json', version: '20171018'
    compile group: 'com.google.guava', name: 'guava', version: '19.0'

}

but regardless of whether I use the fatJar task

task fatJar(type: Jar) {
    zip64 true
    baseName = project.name + '-all'

    from {
        configurations.runtime.filter { !(it.name =~ /.*\.pom/) }.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}

Or use the shadowJar plugin:

shadowJar {
    zip64 true
    baseName = 'ms-edge'
}

The classes in the org.apache.hadoop:hadoop-common:2.7.0-me-1803 artifact are never included. What am I missing?
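For reference, the shadow plugin itself is applied at the top of my build script roughly like this (the plugin version is just the one I happen to have; treat it as an example):

plugins {
    id 'java'
    // Shadow plugin that provides the shadowJar task; version shown is an example
    id 'com.github.johnrengelman.shadow' version '2.0.4'
}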

What is that filter for? Have you tried building without it? I don’t see why that filter would matter for that jar, but everything else looks correct. Try removing the filter and see what happens.

Because without the filter, I get the following error:

> Task :fatJar FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':fatJar'.
> Could not expand ZIP '/home/me/.gradle/caches/modules-2/files-2.1/org.apache.drill/drill-client/1.13.0/c90c87887c292a3712eccc7cebdc48b0b9d93ec9/drill-client-1.13.0.pom'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See Command-Line Interface - Gradle User Manual

The only way I could see to fix the error was to keep zipTree from trying to expand the .pom files, which is exactly what the filter does.
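A sketch of the same task written the other way around, keeping only real .jar archives instead of excluding .pom files (same effect, assuming nothing but jars and poms end up in the configuration):

task fatJar(type: Jar) {
    zip64 true
    baseName = project.name + '-all'

    from {
        // Only expand actual archives; skip the .pom metadata files that zipTree cannot open
        configurations.runtime.filter { it.name.endsWith('.jar') }.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}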

That is odd; I have never had Gradle try to put a pom in my fat jar. Can you see if you get different behavior if you change “runtime” to “compile” in your from closure? I have always done it with compile, although I see no reason runtime wouldn’t work.

I updated the fatJar task from runtime to compile.

task fatJar(type: Jar) {
    zip64 true
    baseName = project.name + '-all'

    from {
          configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    }
}

Still the same “Could not expand ZIP …” error.

I have used that task in probably a half-dozen projects to create a fatJar and it has always just worked. Your issue is a real head-scratcher.

I added the filter back, which put me back to my original problem: hadoop-common-2.7.0-me-1803.jar is still not being included in the fat jar.
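To check whether Gradle is even resolving that jar into the configuration, I added a quick diagnostic task (a sketch; the task name is mine):

// Lists every file resolved for the compile configuration so we can see
// whether hadoop-common ever shows up.
task printCompileFiles {
    doLast {
        configurations.compile.each { println it }
    }
}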

In the excerpt of your dependencies, the version of the hadoop dependency is missing the closing quote. Is it really like that, or is that just a product of cut-and-pasting into the forums?

It’s a cut-and-paste error. I’ve found a workaround that involves downloading the jar, unzipping it, and appending its classes to the existing fat jar. Apparently, others have run into this issue too, so I’m reusing their recommended workaround to build my fat jar.
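For anyone who hits the same thing, the workaround boils down to merging the manually downloaded jar into the fat jar. Expressed in the build script it looks roughly like this (the libs/ path is just where I dropped the downloaded jar; adjust as needed):

task fatJar(type: Jar) {
    zip64 true
    baseName = project.name + '-all'

    from {
        configurations.compile.filter { !(it.name =~ /.*\.pom/) }.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }

    // Manually downloaded hadoop-common jar that Gradle refuses to pick up on its own
    from zipTree(file('libs/hadoop-common-2.7.0-me-1803.jar'))
}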