Understanding Transitive Package Management

(Christopher Dieringer) #1

Hi all:

Having come from the JS world, I’m having difficulty understanding how to best scaffold up some interconnected projects using gradle.

Suppose I have project B, which is nothing other than some Google protobuf definitions. I have another project, C, which is also some protobuf definitions. B depends on C. Suppose that another package, A, consumes B. Thus, A transitively consumes C. Note that each of these projects lives in a separate repository.

How to package/publish B is a bit of a mystery to me.

  • I can bundle C’s source (.proto files) into B. However, this is nasty, as I’ve just duplicated C’s code in B. The package payload is larger, and my ability to dynamically satisfy C’s version via a version string is greatly reduced, such as when A also has a direct dependency on C.
  • I can just ship B standalone, but now when A runs, everything blows up because A has no reference to C. Thus, it’s now up to the consumer to traverse its dependencies’ dependencies and add them into A, which is… a headache.

In npm land, I didn’t necessarily publish a runnable artifact (e.g. a .jar), but rather an npm-digestible package. That package declares its dependencies, and on install the package manager recursively fetches all of them.
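For illustration, that npm-side declaration is just a few lines in package.json (package names and versions here are hypothetical):

```json
{
  "name": "B",
  "version": "1.0.0",
  "dependencies": {
    "C": "^2.1.0"
  }
}
```

On `npm install B`, the package manager reads this file and fetches C (and C’s dependencies) automatically.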

It would seem that this is something Gradle could/should support as well? I don’t publish an artifact, but rather a Gradle project. My npm package.json ==> build.gradle. If all I can publish is an artifact, complexity remains high, as using a dependency manager hasn’t given me much beyond a download tool and declarative versioning.

this says “Manage transitive dependencies,” but I have no idea how for the above situation. Would love some insight from the gradle wizards out there :smile: :crystal_ball:.

(Stefan Oehme) #2

You just publish B / C as jars/zips/whatever format suits your needs. You let B depend on C. Gradle will include this dependency information in the generated descriptors (e.g. pom.xml when publishing to a Maven repository). Now, when A depends on B, Gradle will automatically resolve C too.
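A minimal sketch of what B’s build script might look like, assuming publication to a Maven repository (group, versions, and the repository URL are all made up):

```groovy
// B/build.gradle -- hypothetical coordinates throughout
plugins {
    id 'java-library'
    id 'maven-publish'
}

group = 'com.example'
version = '1.0.0'

dependencies {
    // B depends on C; Gradle records this in B's generated pom.xml,
    // so consumers of B resolve C transitively
    api 'com.example:C:2.1.0'
}

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven { url = 'https://repo.example.com/releases' }
    }
}
```

A then only declares `implementation 'com.example:B:1.0.0'`, and C shows up on its resolved dependency graph without being mentioned anywhere in A’s build script.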

(Christopher Dieringer) #3

Hey @st_oehme, thanks so much for sharing! That’s great news.

What I’m still left wondering, however, is whether there is a way to achieve the following. Consider:

  • C has c.proto as an artifact
  • B has b.proto as an artifact
    • b.proto has a statement of import "c.proto"
    • ^^ assumes b.proto and c.proto live next to each other when it’s time to compile the proto files

If A is to compile b.proto successfully, I still need some knowledge of how both B’s and C’s artifacts are structured. Even if installing B downloads c.proto and puts its contents somewhere, I haven’t really solved my problem: B, as a dependency of A, isn’t in the state I need it to be in. It’s as though I want to specify a post-install hook in B that tells it to grab C’s source and merge it with its own. One could say that A could be responsible for that instead, but again, now A has insider knowledge of B’s setup. A should be able to say “I need B” and have B ready-to-go. “Ready-to-go” is subjective, of course, because the artifact I’m publishing in this example is, in fact, just b.proto–intentionally!
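Concretely, the coupling in question is the import line inside B’s artifact (message names here are hypothetical):

```protobuf
// b.proto -- the artifact published by B
syntax = "proto2";

import "c.proto";  // resolves only if c.proto is on protoc's import path

message Bee {
  optional Cee nested = 1;  // Cee is defined in c.proto, i.e. in C's artifact
}
```

Downloading B alone gives the consumer a file that protoc cannot compile until c.proto is placed on the import path.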

(Stig Kleppe-Jørgensen) #4

Hi Christopher,

There is an easy way around this: use the protobuf gradle plugin as it handles this case as mentioned here: https://github.com/google/protobuf-gradle-plugin#protos-in-dependencies.
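For reference, pulling in a dependency’s protos with that plugin looks roughly like this (an untested sketch; coordinates and versions are placeholders, and per its README the plugin distinguishes protos it should compile from protos that are import-only):

```groovy
plugins {
    id 'java'
    id 'com.google.protobuf' version '0.9.4'
}

dependencies {
    // Protos in the 'protobuf' configuration are extracted from the
    // archive and compiled along with this project's own protos
    protobuf 'com.example:C:2.1.0'

    // Protos in ordinary configurations are extracted and added to
    // protoc's import path only, without being compiled here
    implementation 'com.example:D:1.0.0'
}
```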

Without the plugin, I would add some steps to B’s build logic:

  • Unpack C into B’s build directory
  • Add this path to the protobuf compiler for the imports
  • Build B’s protos, but use C’s protos only for the imports, i.e. do not compile them
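Those steps might be wired up roughly like this in B’s build script (a sketch, not tested; the configuration name, paths, and C’s coordinates are all assumptions, and it presumes C is published as a zip of .proto files):

```groovy
// Hypothetical manual wiring in B/build.gradle
configurations { protoDeps }

dependencies {
    protoDeps 'com.example:C:2.1.0'  // C's protos, published as a zip
}

// Step 1: unpack C's protos into B's build directory
task unpackProtoDeps(type: Copy) {
    from configurations.protoDeps.collect { zipTree(it) }
    into "$buildDir/extracted-protos"
}

// Steps 2 and 3: point protoc at the extracted protos for imports only;
// only B's own protos are passed as compilation inputs
task compileProtos(type: Exec) {
    dependsOn unpackProtoDeps
    commandLine 'protoc',
        "--proto_path=$buildDir/extracted-protos",  // imports resolve here
        '--proto_path=src/main/proto',
        "--java_out=$buildDir/generated",           // or whichever language's out flag
        'src/main/proto/b.proto'
}
```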

(Christopher Dieringer) #5

hi @stigkj1, ya, that would be great! unfortunately, however, I’m working w/ a C# project, not a Java project. consequently, the plugin won’t cut it, as it’s hardwired to Java. and even if they’ve rev’d it to support proto v3, i’m stuck w/ proto v2 for a while.

perhaps the takeaway from this is that 1) Gradle can do transitive deps, and 2) there’s simply no automated tooling available at this point in time for our tech stack.