But I could never get any of them to do what I want.
I'm not sure I understand why. Is project.ext in a task being set after dependencies have already been resolved? Is there a more idiomatic way of doing this?
Actually, that plugin's readme and code, as well as your build scripts, are full of bad practices, so it is hard to tell how it is supposed to be used. You had probably better ask that plugin's maintainer how it is intended to be used.
Why your configurations are not doing what you want would need some debugging, but I think the problem is that you do things too late for the plugin to see them. The plugin uses the bad afterEvaluate, which in your cases probably runs before you do your settings: in the first case you use cross-project configuration, and in the second case you only do the configuration when the respective tasks are configured, which is also too late for what that plugin does.
Besides that, in your first case the settings overwrite each other if multiple of those tasks are requested, and they only take effect if the tasks are requested with their full path, not abbreviated; in your second case they overwrite each other even if the tasks are only configured but not going to be executed.
A better solution would most probably be to use component metadata rules to properly configure the variants for those dependencies and then use variant-aware building, similar to what the latest version of the JavaFX Gradle plugin is doing now.
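To give a rough idea, a rule in the spirit of the user guide's native-jars example could look like the following sketch; the module coordinates, classifiers, and attribute values are just placeholders, not taken from your build:

```kotlin
import javax.inject.Inject

// adds one runtime variant per native classifier, carrying OS and architecture attributes
@CacheableRule
abstract class NativesRule : ComponentMetadataRule {
    @get:Inject
    abstract val objects: ObjectFactory

    override fun execute(context: ComponentMetadataContext) {
        // give the plain runtime variant neutral values so it is not picked
        // when a concrete platform is requested
        context.details.withVariant("runtime") {
            attributes {
                attribute(OperatingSystemFamily.OPERATING_SYSTEM_ATTRIBUTE, objects.named(OperatingSystemFamily::class.java, "none"))
                attribute(MachineArchitecture.ARCHITECTURE_ATTRIBUTE, objects.named(MachineArchitecture::class.java, "none"))
            }
        }
        mapOf(
            "natives-windows" to (OperatingSystemFamily.WINDOWS to MachineArchitecture.X86_64),
            "natives-linux" to (OperatingSystemFamily.LINUX to MachineArchitecture.X86_64)
        ).forEach { (classifier, osAndArch) ->
            context.details.addVariant("$classifier-runtime", "runtime") {
                attributes {
                    attribute(OperatingSystemFamily.OPERATING_SYSTEM_ATTRIBUTE, objects.named(OperatingSystemFamily::class.java, osAndArch.first))
                    attribute(MachineArchitecture.ARCHITECTURE_ATTRIBUTE, objects.named(MachineArchitecture::class.java, osAndArch.second))
                }
                withFiles {
                    addFile("${context.details.id.name}-${context.details.id.version}-$classifier.jar")
                }
            }
        }
    }
}

dependencies {
    components {
        // apply the rule to whichever modules ship their natives as classified jars
        withModule<NativesRule>("org.lwjgl:lwjgl")
    }
}
```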
Oh wow! This was exactly what I was looking for but somehow never came across. So if I were to employ a variant rule like this one from the user guide, how would I select between runtime variants of dependencies in a task like :core:windows (Example 14 takes me 90% of the way there)? Would I have wrapper configurations that assert the OS attribute?
If you have the time/charity to take a look through the full build scripts and give feedback, I've listed them here. The source repository is also available here.
If you want cross-platform building for several OSes within one Gradle run, you could for example use different configurations, or even just different artifact views, which also support withVariantReselection(), depending on what you need and can use in those tasks.
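A rough sketch of what per-platform artifact views could look like, assuming the dependency variants carry the standard OS and architecture attributes (for example through rules like the one above); the task name and values are just examples:

```kotlin
// re-select the dependency variants of the normal runtime classpath for one platform
val windowsNatives = configurations.runtimeClasspath.get().incoming.artifactView {
    withVariantReselection() // redo variant selection instead of reusing the graph's choice
    attributes {
        attribute(OperatingSystemFamily.OPERATING_SYSTEM_ATTRIBUTE, objects.named(OperatingSystemFamily::class.java, OperatingSystemFamily.WINDOWS))
        attribute(MachineArchitecture.ARCHITECTURE_ATTRIBUTE, objects.named(MachineArchitecture::class.java, MachineArchitecture.X86_64))
    }
}.files

tasks.register<Copy>("collectWindowsNatives") {
    from(windowsNatives)
    into(layout.buildDirectory.dir("natives/windows"))
}
```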
If you only want to build one platform at a time, it would probably be enough to just configure the artifacts for the runtimeClasspath configuration as shown in the example you mentioned.
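That could be as little as this sketch; where the concrete OS and architecture values come from is of course up to your build:

```kotlin
configurations.runtimeClasspath {
    attributes {
        // request the platform to build for on the normal runtime classpath
        attribute(OperatingSystemFamily.OPERATING_SYSTEM_ATTRIBUTE, objects.named(OperatingSystemFamily::class.java, OperatingSystemFamily.WINDOWS))
        attribute(MachineArchitecture.ARCHITECTURE_ATTRIBUTE, objects.named(MachineArchitecture::class.java, MachineArchitecture.X86_64))
    }
}
```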
If you have the time/charity to take a look through the full build scripts and give feedback
You mean regarding the bad practices I mentioned?
Well, as you asked for it, I took a cursory look; here are some points:
Do not use mavenLocal(), especially not as the first repository and especially not without a repository content filter. It is broken by design and also makes builds slower. (Declaring repositories)
Do not use buildscript { ... } repositories and dependencies for plugins. Better use pluginManagement { repositories { ... } } for the repos, and then it should just work with the plugins { ... } block. If it does not because some plugin does not publish the marker artifact, report a bug and use pluginManagement { resolutionStrategy { ... } } to do the mapping from plugin id to dependency.
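For example, a sketch for the settings script; the plugin id and coordinates in the resolutionStrategy part are purely hypothetical:

```kotlin
// settings.gradle.kts
pluginManagement {
    repositories {
        gradlePluginPortal()
        mavenCentral()
    }
    // only needed for plugins that do not publish the plugin marker artifact
    resolutionStrategy {
        eachPlugin {
            if (requested.id.id == "com.example.some-plugin") { // hypothetical plugin id
                useModule("com.example:some-plugin-gradle:${requested.version}") // hypothetical coordinates
            }
        }
    }
}
```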
Do not use allprojects { ... }, subprojects { ... }, project(...) { ... }, and similar; they immediately introduce project coupling through cross-project configuration, which makes builds harder to understand, harder to maintain, and also works against some more sophisticated Gradle features. Better use convention plugins in buildSrc or an included build, for example implemented as precompiled script plugins.
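A minimal sketch of such a precompiled script plugin, assuming a buildSrc build that applies the kotlin-dsl plugin in its own build.gradle.kts; the plugin name and its contents are placeholders:

```kotlin
// buildSrc/src/main/kotlin/my-java-conventions.gradle.kts
plugins {
    java
}

repositories {
    mavenCentral()
}

tasks.withType<Test>().configureEach {
    useJUnitPlatform() // an example of a convention every project should share
}
```

Each project that needs those conventions then applies them with plugins { id("my-java-conventions") } instead of being configured from the outside.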
Do not use the legacy apply plugin syntax, but the plugins { ... } block to apply plugins.
Do not use ext / extra properties; they are almost always just a work-around for not doing something properly. For the version declarations, better use a version catalog. For other things, if you only need some variable within the same build script, just use a normal local variable; if you want something static like appName available in all projects, better just put it into your project's gradle.properties file.
Strongly consider also putting JitPack as low in the list as possible if you need it at all, and adding repository content rules. It is also broken by design for normal releases, and it sometimes misbehaves and destroys metadata like the Gradle Module Metadata file while trying to do its broken magic. Besides that, it is also super slow, especially if something was not requested, and thus not built, before.
Be careful with checking startParameter.taskNames; you can for example also do ./gradlew core:win or ./gradlew c:w or ./gradlew windows, and your logic will not match any of them.
Never use implementation files(...) or implementation fileTree(...); better use a flatDir repository. Indeed, you already define a flatDir repository, you just don't use it.
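A sketch of how that could look; directory and module names are placeholders:

```kotlin
repositories {
    flatDir {
        dirs("libs") // the directory holding your local jars
    }
}

dependencies {
    implementation(":some-local-lib:1.0") // resolved from libs/some-local-lib-1.0.jar
}
```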
Consider using the JVM toolchains feature instead of ...Compatibility to decouple the Java version used to run Gradle from the Java version used to build and run your production code.
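For example, with the Java version just being an example (on older Gradle versions you would need languageVersion.set(...)):

```kotlin
java {
    toolchain {
        // Gradle finds or provisions a matching JDK, independent of the one running Gradle
        languageVersion = JavaLanguageVersion.of(17)
    }
}
```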
Never use new File(...) with a relative path as argument in any JVM project, except for one rare case: when you get a relative path from an end user, for example as a command-line argument. So for Gradle build logic: never, ever. new File(...) with a relative path is resolved relative to the current working directory. For a Gradle build this is sometimes the directory where the user invokes Gradle, but often it is not, and even if it is, this might be the project directory, but not necessarily. Just use, for example in this case, workingDir = file("../assets"), as file(...) is a Gradle method that resolves the path relative to the project directory as intended.
Instead of an own run task, you could simply use the application plugin, which defines the task properly for you; you just have to configure the main class in the application extension.
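A sketch of that, with the main class name as a placeholder:

```kotlin
plugins {
    application
}

application {
    mainClass = "com.example.game.DesktopLauncher" // placeholder, use your launcher class
}
```

Then ./gradlew run just works without an own task.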
My personal recommendation: do not build a fat jar at all. They are imho very bad practice (except if done right, like Spring Boot does), can easily break things, and often cannot even work properly. See https://fatjar.net/ for some more quotes. If you really need one, at least use the shadow plugin, which works around some of the pitfalls, even if not all. And if you prefer to keep your custom logic, at least remove your useless Class-Path attribute. The way you set it, it has invalid content that can never have an effect, and even if you fixed that, it would make the jar only usable on your computer, and only until the Gradle cache is eventually cleaned. But as you do the bad practice anyway, i.e. repack the libs into the jar, it just doesn't matter that it is invalid.
Do not add explicit dependsOn between tasks, except with a lifecycle task on the left-hand side. Whenever you otherwise need it, you have a code smell. You should usually wire task inputs to task outputs and thereby get the necessary task dependencies automatically. The case you have is also a no-op: as you do it with jar, the dependency on the class compilation tasks is already there as necessary.
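For example, something like this sketch, with task name and destination made up, instead of an explicit dependsOn:

```kotlin
val packageDistribution by tasks.registering(Copy::class) {
    // using the jar task as input implies the task dependency on it
    from(tasks.jar)
    into(layout.buildDirectory.dir("dist"))
}
```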
I guess - or hope - the three Exec tasks are just some experiment, because it makes absolutely no sense to have an Exec task that just executes true, let alone three of them. If you want a task that does nothing, define a task that does nothing, not a task that starts an external process that just does nothing. Besides that, as long as you do not support building for multiple platforms in the same build run, do not use a task to distinguish which platform to build for, but for example a project property.
And finally more a personal recommendation, switch to Kotlin DSL. By now it is the default DSL for Gradle, you immediately get type-safe build scripts, you get actually helpful error messages if you mess up the syntax, and you get amazingly better IDE support when using a proper IDE like IntelliJ IDEA or Android Studio.
Thank you so much for all your feedback! I took some time to rewrite my build files in Kotlin. But before I go on to ask further questions:
Sadly this isn't within my control. This project is a video game and updates/modifications are distributed as drop-in fatjars.
I rewrote the whole thing, used the shadow plugin, and replaced the javacpp plugin with an inline rule as you suggested (any best practice tips once again appreciated).
As for the questions: the game engine libgdx brings in a lot of unused transitive dependencies under org.lwjgl. I'd love to have a rule similar to what I'm using for JavaCpp, but for Lwjgl; however, I can't seem to come up with something that programmatically excludes all of the unused natives (arm, multilib, non-target platform, etc.) for the configuration while adding the needed dependencies to the compile/runtime classpaths.
This was a naive attempt just to try and override the used dependencies to be platform specific, but I kept getting jars with missing shared objects. Trying to exclude the transitive dependencies on lwjgl3 from imgui and libgdx didn't have the desired result.
I think I'm just completely off base with my approach, any help appreciated!
Edit: I think I see now that those excludes will not exclude the artifacts but the whole module… after resolution?? I'm not quite clear on the semantics.
I would use the TOML variant of the version catalog. As you only have simple definitions, not some dynamic calculations or similar, you otherwise lose the advantage that only the things that actually use a specific version become out-of-date: if you change the settings script, all tasks are out-of-date as the runtime classpath changes.
Also, is there a reason to have some of the dependencies still hard-coded in the build script instead of adding them to the version catalog?
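Just to illustrate the format, a hypothetical entry in gradle/libs.versions.toml; coordinates and version are only examples:

```toml
[versions]
gdx = "1.12.1"

[libraries]
gdx-core = { module = "com.badlogicgames.gdx:gdx", version.ref = "gdx" }
```

In the build script that is then referenced type-safely as libs.gdx.core.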
https://oss.sonatype.org/content/repositories/releases as a repository probably just wastes time? Everything that is found there should also be on mavenCentral(), as it is its pre-stage, isn't it?
What is the intention of platformImplementation? From only what you have shown, it is identical to runtimeOnly.
I would use configurations.runtimeClasspath { attributes { ... } } instead of configurations["runtimeClasspath"].attributes { ... } to use type-safe instead of stringy API.
You don't need tasks.named<ShadowJar>("shadowJar"); just use the generated type-safe accessor tasks.shadowJar. The named syntax is only necessary for tasks created dynamically, or when you are not in the build script where the plugin that adds the task is applied using the plugins { ... } block.
If you need to use archiveBaseName.set(...), I would recommend updating Gradle. Once you have, you can simply use archiveBaseName = .... Wherever you read .set(...) for a Property somewhere, that is only necessary for older versions of Gradle. (The same goes for other places like mainClass.set(...) and so on.)
Don't ever comment that isCanBeResolved abuse back in.
As you correctly commented, it is nonsense and might easily break other things, or even be forbidden in later Gradle versions.
Either create a new configuration that extends from it and is resolvable, or just use an existing resolvable one like you do now with the default runtimeClasspath. Otherwise you would also miss runtimeOnly dependencies and in your case platformImplementation dependencies.
java.srcDirs is a varargs method, so you don't need to use listOf, but can directly do java.srcDirs("src", "dependencies").
But on the other hand, srcDir(...) and srcDirs(...) just add to the existing dirs. You might consider saving a second by using java.setSrcDirs(listOf("src", "dependencies")) and resources.setSrcDirs(listOf("src")); then Gradle does not have to check the other directories.
This was a naive attempt just to try and override the used dependencies to be platform specific
Sounds not too bad.
But it is hard to say without taking a hands-on look.
One thing that might become problematic is the architecture.
You use x86_64 for ffmpeg, but x86-64 for lwjgl.
The reason is that you use that value as part of the classifier, but that is bad, as you also use it as the value for the architecture attribute. So if you request architecture x86_64, lwjgl will not match; if you request x86-64, ffmpeg will not match.
Unless of course you also define an attribute compatibility rule for that attribute that makes those values compatible.
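Such a rule could look roughly like this sketch, in case you really wanted to keep both spellings:

```kotlin
// treats the two architecture spellings as equivalent during attribute matching
class ArchitectureAliasRule : AttributeCompatibilityRule<MachineArchitecture> {
    override fun execute(details: CompatibilityCheckDetails<MachineArchitecture>) {
        val values = setOf(details.consumerValue?.name, details.producerValue?.name)
        if (values == setOf("x86_64", "x86-64")) {
            details.compatible()
        }
    }
}

dependencies {
    attributesSchema {
        attribute(MachineArchitecture.ARCHITECTURE_ATTRIBUTE) {
            compatibilityRules.add(ArchitectureAliasRule::class.java)
        }
    }
}
```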
But instead I would suggest you use consistent values, and for the ones already present in org.gradle.nativeplatform.MachineArchitecture, use the constants defined there.
That's also the reason the documentation example had the full classifier in the classifier field, even if it duplicated information.
In your variant, the classifier field does not have the full classifier; you combine it with the architecture instead, which then causes this potential problem.
I think I'm just completely off base with my approach, any help appreciated!