Create a root version catalog from other catalogs

I have a root project which consists of many subprojects (via git submodules). They are all using version catalogs defined in their settings.gradle.kts.
When those projects run on their own, they use their respective settings.gradle.kts.
When I run all those projects in my root project, they use the root settings.gradle.kts and not their ‘local’ one.
Ideally, I would like to be able to combine all the settings.gradle.kts version catalogs from all the subprojects and then only override or add versions in the root settings.gradle.kts.
Is this possible?
I assume the first suggestion would be to make all my projects use a remote TOML file, but I'm not really looking to do that in this case.

I guess there are two questions:

  1. Is there a way to merge version catalogs?
  2. Can you merge/inherit from settings.gradle.kts version catalogs?

You should never, ever, ever, ever include one project in more than one build. What you want instead is a composite build, where you includeBuild the whole other builds into your build; then each build uses its respective version catalog as normal.
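
For illustration, the umbrella build's settings.gradle.kts could then look roughly like this (the build paths are made up; each sub-build keeps its own settings.gradle.kts and version catalog):

    // settings.gradle.kts of the umbrella build
    rootProject.name = "mega-project"

    includeBuild("analytics")
    includeBuild("networking")
    includeBuild("viewmodels")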

When using them as composite builds, would each project be able to depend on each other?
So in my local checkout I would have a folder full of these builds; for example, one of them is a networking module that the others would then need to depend on.
Using the old way, I would normally do

implementation(project(":Logging:shared"))

Yes, they can depend on each other with composite builds too.
Just not as a project dependency, since they are projects in another build.
Instead, you would use the normal coordinates at which they are published, so for example implementation("my.group:logging-shared") or whatever.
If they always come in through the composite build, you can leave out the version.
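
As a rough sketch, a consuming module's build.gradle.kts could then contain something like this (group and module names are made up):

    dependencies {
        // Resolved by coordinates; when the logging build is part of the
        // composite, Gradle substitutes the local project for these
        // coordinates, so no version is needed here.
        implementation("my.group:logging-shared")
    }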

Ah, I see, many thanks. That makes perfect sense.

So I am able to compile locally using the coordinates without the version number, but when I then publish to Maven Local, the output references the coordinates without the version number, which means the consuming clients/projects cannot use it.

So, for example, a project containing lots of composite builds:

  • com.hello.analytics
  • com.hello.networking
  • com.hello.viewmodels

And then I would have an Android app that consumes the viewmodels module, which is then unable to fetch the analytics and networking modules:
Failed to resolve: com.hello.analytics:shared

I was expecting that when I publish to Maven Local it would then be able to reference the others, but maybe not.
I guess the other option is to add the Android client to the mega project containing all the other modules.

I want to know the recommended best practice for something like this.

By default the declared versions are published, which in your case means no versions at all.
You can configure publishing to use the resolved versions instead.
I would expect that to produce the intended result.
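
Something along these lines, assuming the maven-publish plugin is applied to the publishing projects (just a sketch, not tested against your setup):

    // build.gradle.kts: publish resolved versions instead of declared ones
    publishing {
        publications.withType<MavenPublication>().configureEach {
            versionMapping {
                // take the versions from the resolution result for all variants
                allVariants {
                    fromResolutionResult()
                }
            }
        }
    }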

All my modules are KMP, so I'm not sure how to add this to the maven-publish configuration.

I added

    versionMapping {
        allVariants {
            fromResolutionResult()
        }
    }

But this doesn't seem to resolve the issue. There are references in the Gradle docs to doing this on the java-runtime usage, but that doesn't seem to help either.

Then just declare the version you want published in the dependency.
The included build wins anyway (by default), so for building the version is ignored, but it should then hopefully end up in the published metadata.
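
Roughly like this (the version number is just illustrative):

    dependencies {
        // The declared version ends up in the published metadata, but while
        // building inside the composite, the included networking build is
        // substituted regardless of this version.
        implementation("com.hello.networking:shared:1.0.0")
    }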

Ah, OK, that makes my build scripts far simpler. I guess the only issue is that if there is a mismatch of versions, then it will end up using a remote version.
E.g. if Analytics on develop relies on networking 1.0.0 but networking on develop is at 1.0.1, then the remote version would end up being used; this is the only caveat.

Unless I change the logic of my projects to inherit the version number, so it does something like

1.0.0-mega-project as the version, with all the included modules inheriting it.

I guess the only issue is that if there is a mismatch of versions then it will end up using a remote version.

No, as I said, an included build always wins unless you change the default configuration.

Sorry, maybe a misinterpretation on my part.

The networking module in develop is at version 1.0.1.
The analytics module in develop depends on version 1.0.0 of networking, because a developer never bumped the version.

When in the mega project, analytics will first try to find 1.0.0 locally, which it won't, because the local version shows as 1.0.1; instead it will then include the remote one.

Obviously, if the local version were 1.0.0 then yes, it would take local over remote, but when there is a version mismatch, it will opt for the exact version.

Unless I have misinterpreted it?

This is true when the versions match exactly, not when there is a mismatch.

I’m sorry, but I didn’t understand what you mean.

If the dependency is present in an included build, it will always be taken, no matter what version it builds and no matter what version you depend on.

If the dependency is not present in an included build, the declared version will be used or part of normal conflict resolution.

So if you have 1.0.1 in the included build but declare a dependency on 1.0.0, then after publishing, 1.0.0 will be used (as long as nothing else depends on 1.0.1), and it might fail at runtime.

If that is your concern, you should probably take some measures to protect against it, like an integration test that uses the published versions, or some version consistency check. It is hard to tell how to do it best without having the concrete situation at hand.
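
For what it's worth, here is a very rough sketch of such a consistency check (the task and configuration names are invented, and it only compares requested vs. selected versions on one configuration, so it is not meant as a drop-in solution):

    // build.gradle.kts: fail if a requested version differs from the version
    // that was actually selected during resolution (for example because an
    // included build is on a newer version than the one declared).
    tasks.register("checkVersionConsistency") {
        doLast {
            val result = configurations.getByName("runtimeClasspath")
                .incoming.resolutionResult
            val mismatches = result.allDependencies
                .filterIsInstance<org.gradle.api.artifacts.result.ResolvedDependencyResult>()
                .mapNotNull { dep ->
                    val requested = dep.requested as? org.gradle.api.artifacts.component.ModuleComponentSelector
                        ?: return@mapNotNull null
                    val selected = dep.selected.moduleVersion?.version
                    if (requested.version.isNotEmpty() && requested.version != selected)
                        "${requested.group}:${requested.module}: requested ${requested.version}, selected $selected"
                    else null
                }
                .distinct()
            if (mismatches.isNotEmpty()) {
                throw GradleException("Version mismatches found:\n" + mismatches.joinToString("\n"))
            }
        }
    }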

Oh wow, OK, this is the key point to take home here:

If the dependency is present in an included build, it will always be taken, no matter what version it builds and no matter what version you depend on.

That's fine for my use case; I actually prefer this. I think in my scenario I need to also include the client project in the mega project as a composite build; that way it will then include all the local dependencies it needs.