So I basically have a Spring Boot server that submits jobs to a Spark cluster. To run these jobs on Spark, I somehow have to get the jar containing the job code (written in Java) onto the cluster. To avoid doing this manually, I thought it would be nice if my Spring application could upload the jar on server start.
Therefore I'm trying to add the job jar as a resource to my Spring app. The project structure is as follows:
main
├── spark (small pure Java module with Spark job code)
├── ... other modules
└── rest (Spring Boot server)
Relevant snippet from my Gradle build (it copies the fat jar from the subproject into the main project's resources):
gradle.projectsEvaluated {
    // Copies the shaded job jar from the :spark subproject into the main
    // project's resource output, so it ends up inside the Boot jar.
    task addSparkJobs(type: Copy, dependsOn: ':spark:shadowJar') {
        description "Ensures job jar for Spark is present"
        from project(':spark').libsDir
        include 'spark-*-all.jar'
        rename '.*', 'spark-all.jar' // normalize the name, dropping the version
        into "${sourceSets.main.output.resourcesDir}/jobs/"
    }
    jar.dependsOn addSparkJobs
}
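For completeness, this is roughly what I have in mind on the Spring side: a minimal sketch, assuming the Gradle task above makes the jar available at /jobs/spark-all.jar on the classpath. The listener class and the submitToCluster call are hypothetical placeholders, not existing code.

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class SparkJobJarUploader {

    // Runs once the Spring context has fully started.
    @EventListener(ApplicationReadyEvent.class)
    public void uploadJobJar() throws IOException {
        try (InputStream jar = getClass().getResourceAsStream("/jobs/spark-all.jar")) {
            if (jar == null) {
                throw new IllegalStateException("spark-all.jar not found on classpath");
            }
            // The resource lives inside the Boot fat jar, so copy it out to a
            // regular file first; that file path can then be handed to whatever
            // does the actual upload/submission.
            Path target = Files.createTempFile("spark-all", ".jar");
            Files.copy(jar, target, StandardCopyOption.REPLACE_EXISTING);
            // submitToCluster(target); // hypothetical: e.g. spark-submit or Spark's REST API
        }
    }
}

The idea is that the extracted file can then be passed to spark-submit or Spark's job submission endpoint once the server is up.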