Support for native projects

not-a-bug

(Shahji) #1

Hello there,

Looking for some clarification about Gradle's support for building native projects. I see two independent plugins, and it isn't clear which one is "supported", i.e. forward-looking, and when each is intended to be used.

  1. https://docs.gradle.org/current/userguide/native_software.html
  • This one is on the official website, with lots of documentation on how to do things.
  2. https://blog.gradle.org/introducing-the-new-cpp-plugins
  • The only thing I found about this one was the blog post.
  • There is a lack of documentation on doing even the most basic things (apart from sample projects that attempt to cover all possible use cases).

Some official clarification on which of these is intended for which use would help. A link to official/unofficial documentation about the latter would also be great, since it looks more promising (and more feature-full) than the former.

Appreciate your time!!


(Shahji) #2

@Daniel_L Looks like you are very much involved with the native support within Gradle. Care to comment here?


(Daniel Lacasse) #3

Not a problem @Shahji, I can comment on this. The new plugins mentioned in the blog post are the direction we are moving in. The major difference is how the native model is configured, compared to the documentation you found in the user guide (current model vs. software model). The reason for the change of configuration model is explained in this blog post. In a nutshell, we are working extensively to converge all configuration models so that each improvement in terms of performance, usability, etc. can be applied across software domains.

At the moment, the new C++ plugins aren't ready to fully replace the software model ones. That being said, they are very close, and we hope to finish the last polishing, officially release them, and start documenting a migration path between the two configuration models. The only documentation for the new C++ plugins is provided through samples.

Don’t hesitate to try them out and give us some feedback.


(Shahji) #4

Appreciate the reply @Daniel_L

I am planning on using it for a big project, roughly 100+ subprojects. I am still in the very early stages of evaluation, so I have time, but I will take any help I can get to start the engines. To begin with, I tried using it on a small project, compiling Lua, and that didn't go so well. I followed the examples you referred to and didn't see any obvious discrepancies. Here are my files, followed by the specific issues I ran into -

Core organization:

  • Lua
    • 5.1.1.8
      • Lua
        • *.c
        • *.h
      • build.gradle
    • build.gradle
    • settings.gradle

settings.gradle:

include ':5.1.1.8'
rootProject.name = 'Lua'

Root build.gradle:

allprojects {
    project.group = 'org.lunar'
    project.version = project.name
}

Project build.gradle:

apply plugin: 'cpp-library'

library {
    linkage = [ Linkage.STATIC ]

    source.from project.fileTree(dir: 'Lua', include: '**/*.c')
    publicHeaders.from project.file('Lua')
}

publishing {
    repositories {
        flatDir {
            name = "fileRepo"
            dirs "F:/BBS01/repository"
        }
    }
}

Issues -

  • When I do "gradle build", the process starts and terminates in a few seconds. I see the build folders got created, but no object files and no output library. I had to comment out the publishing block to even run, because I am not including the maven-publish plugin.
  • I don't see any examples of publishing a generated static library to a flatDir repository. Is this even supported currently?
  • When publishing how do I configure the manifest?

Appreciate your time!!


(Daniel Lacasse) #5

Unfortunately, configuring the source through the library directly is currently broken. This is part of the polishing that needs to take place before we announce the new plugins. We hope to get this done soon.

That being said, it doesn't mean it's not possible. I suggest you have a look at the following sample. It should show the basics of how to compile a C library.

To use the publishing block, you need to apply the maven-publish plugin as it adds the required extension to the project.

A flatDir repository is not supported with the native plugins. You need to use a Maven repository in a local folder. See this sample for how to do it.
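For reference, a minimal local-folder Maven repository could look like the following sketch; the directory is taken from your earlier flatDir snippet, and maven-publish must be applied:

```groovy
// Requires: apply plugin: 'maven-publish'
publishing {
    repositories {
        maven {
            // local folder acting as a Maven repository
            url = uri('file:///F:/BBS01/repository')
        }
    }
}
```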

At the moment, there isn't any way to configure the publishing manifest itself. This feature is unlikely to ever exist, simply because we want to strongly model any modification that needs to be made, to avoid mismatch failures in Gradle. You can add attributes to the project's configurations, which will take part in the manifest and dependency resolution. See the JavaDoc for [AttributeContainer](https://docs.gradle.org/current/javadoc/org/gradle/api/attributes/AttributeContainer.html#attribute-org.gradle.api.attributes.Attribute-T-) and the following example:

configurations.someConf.attributes {
    attribute(Attribute.of("some-attribute", String), "some-value")
}

(Shahji) #6

Appreciate the prompt follow up.

Few follow up questions -

  • We have code auto-generators and I am trying to plug those in. How do I add a pre-step to each compilation unit, i.e. each cpp file? Any examples? I am guessing this would require some sort of plugin in the buildSrc folder?

  • Any examples of an api-only, i.e. headers-only, published package?

  • Any examples of a prebuilt static library package, i.e. one for which we have no source code, just libs and headers?


(Shahji) #7
  • How do I set the “targetPlatform” for the library? Also, is there a global way of setting supported platforms per operating system?
  • How do I add new target variants - debug, release, optimized, performance, etc.?
  • How do I tweak the compiler flags for each of those target variants?

(Daniel Lacasse) #8

Hi @Shahji,

Thanks for the interest in the new plugins. Here are the answers to your questions:

  • We have code auto-generators and I am trying to plug those in. How do I add a pre-step to each compilation unit, i.e. each cpp file? Any examples? I am guessing this would require some sort of plugin in the buildSrc folder?

At the moment, we treat a set of compilation units together. Would you mind expanding on what use case you are trying to achieve with this per-compilation-unit pre-step? You could certainly solve it by inserting a transform task that processes each generated compilation unit and produces another source set, which is then used by the compilation task. It would basically be something like:
source generation task > source transformation task > compile task
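A rough sketch of that chain, using Copy tasks as stand-ins for the real generation and transformation logic (all task names and paths here are placeholders):

```groovy
// Placeholder generation step; a real build would use a custom task type.
task generateSources(type: Copy) {
    from 'src/templates'
    into "$buildDir/generated/raw"
}

// Transform step; consuming the generator task's outputs implies the task dependency.
task transformSources(type: Copy) {
    from generateSources
    into "$buildDir/generated/cpp"
}

library {
    // the compile task picks up the transformed sources
    source.from transformSources
}
```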

That being said, we would love to hear about the use case so we can model it or account for it in Gradle.

  • Any examples of an api-only, i.e. headers-only, published package?

Although there is currently an issue with header-only libraries, you can work around it for now. We don't have a sample for this yet.

  • Any examples of a prebuilt static library package, i.e. one for which we have no source code, just libs and headers?

We don't have a sample for this particular use case. However, the structure should be quite similar to consuming a prebuilt shared library.

  • How do I set the “targetPlatform” for the library? Also, is there a global way of setting supported platforms per operating system?

This concept is coming soon. We are looking at modeling this much more deeply in the new plugins. At the moment, we allow declaring the supported operating systems. Soon we will start the work on architectures. Those are definitely on the roadmap.

  • How do I add new target variants - debug, release, optimized, performance, etc.?

Contrary to how this was configured with the software model, we are looking at providing a better extension model that allows build authors to model what those variants really mean for them.

  • How do I tweak the compiler flags for each of those target variants?

You can achieve this using the binary container on the application or library component. This sample shows an example that you can try.

Don’t hesitate to ask more questions,

Daniel


(Shahji) #9

Hello @Daniel_L

Appreciate the follow up. I am starting to get the hang of how to do things in Gradle. I am still stumbling into a lot of stuff that just isn't there yet. Planning on setting up a local Artifactory Maven repository to implement publishing.

We use Qt in our project, and one of the features it includes is autogenerating some connection logic code using a tool named moc. It basically looks at some macros in the input source/header and generates a new cpp file, which then needs to be compiled and linked. We have some changes to the process compared to the default out-of-the-box implementation in Qt. We won't need to change the sourceSets, as we instead explicitly include the generated file from the cpp. So as long as the path is included in publicHeaders/privateHeaders, this should work. We just need a way to insert the moc task before the file gets compiled. We could do it as a dependency on CppCompile.class, but that won't be incremental, i.e. run only for the cpp files that have changed. What I currently have seems to work, but it is non-incremental: QtMocGeneratorTask blindly generates a moc file for all cpps included in the sourceSet.

tasks.withType(CppCompile.class) {
    dependsOn tasks.withType(QtMocGeneratorTask.class)
}

This is less useful at the moment without actually being able to define/differentiate between debug, release, profile, performance, etc. I haven't gotten to the point of making it work across operating systems just yet.

A few more follow up questions -

  • How do I define multiple library/application outputs from a single build.gradle subproject file?
  • The output libraries/dlls are named based on the project name, I guess. How do I change that? How would I do that for build files that generate multiple libraries/dlls?
  • What's the easiest way to globally change the output directory, i.e. "build", for all subprojects to be under one root? At the moment I am using this in each subproject, but I would prefer to do it in one place somewhere at the top level.

buildDir = rootProject.projectDir + "/" + java.nio.file.Paths.get(rootProject.projectDir.toURI()).relativize(java.nio.file.Paths.get(project.projectDir.toURI())) + "/" + project.version

  • How do I add a new tool chain?
  • Any examples of how to configure (path to exe, libraries to include, paths to system libraries, paths to system includes, etc) included tool chains?
  • Any examples of using resource compiler for windows UI based binaries?
  • I have global macros defined for all projects in the root project's build.gradle so they are all in one place. I have an exception for one specific project, where I have to remove an already-defined macro. How do I remove an entry from the macros list?

(Shahji) #10

Hey @Daniel_L

I have an immediate blocker to make progress. Wondering if you could help!

Could you provide a few pointers on how to publish a package (includes headers and libraries) using pre-built binaries?

Thanks!


(Shahji) #11

How do I include non-.h includes in publicHeaders for publishing?

I tried this and that didn’t work.

publicHeaders.from project.fileTree(dir: 'includes', include: '**/*.inl')


(Mate Molnar) #12

Hi Shahji,

What I came up with:

  1. define headers and binaries paths
  2. generate headers zip (zip task)
  3. generate module file
  4. publish headers, module file and libraries to maven repo (maven publish)

After publishing I can consume these dependencies the standard way.

The problem I have is that I have to purge the gradle cache each time I update a package.

I am also looking for information about the preferred way.


(Shahji) #13

Is this defining a new artifact or extending the default cpp-headers artifact? I would guess there is a way to extend the existing artifact. A script example would help!


(Mate Molnar) #14

Since I’m not building any libraries, it’s a new artifact.
I’m not using any plugin besides maven-publish.
Basically I’m zipping all headers, creating the metadata file and publishing it.

I’ve asked around about creating the module metadata file via an api multiple times, but haven’t got an answer. What I have now is a hashmap with all information and I use groovy’s JsonBuilder class to create the json file. It’s hacky and needs changes between Gradle versions.

I know I’m on the wrong track with this, but I was not able to make it work any other way.


(Shahji) #15

Do you have an example that I can work off of? What’s needed is a ‘cpp-api’ plugin just like ‘cpp-library’ and ‘cpp-application’.


(Shahji) #16

@Daniel_L or one of the other core developers chime in on this. Thanks!!


(Daniel Lacasse) #17

Hi @Shahji and @matemoln,

I am still stumbling into a lot of stuff that just isn't there yet.

The goal for the new native plugins is to behave similarly to the other language domains, to allow easy cross-domain build improvements from Gradle as well as from 3rd-party plugins. This will require new APIs, which will take more time. Right now, we went the route of doing the native work with the goal of replacing the old plugins soonish. For example, integration with 3rd-party plugins such as the Artifactory plugin is not tested and will most likely not work out of the box. It will eventually work, but the focus isn't on this right now.

It is important to make the distinction between what doesn't work with Gradle and what isn't provided out of the box. In general, there's only a tiny fraction of things that don't work with Gradle. Gradle's richness allows pretty much anything to work; it may just be cumbersome to implement.

As an example, for one project I was able to make the new native plugins cross-compile for bare-metal ARM processors while provisioning both the host and cross-compiler, as shown in the samples. Some work had to be done, but as we move forward more and more will be handled by Gradle out of the box.

We use Qt in our project and one of the features it includes is autogenerating some connection logic code using a tool named moc.

I would advise against adding a dependency on your QtMocGeneratorTask from CppCompile. It's too fragile. You should always prefer strongly modeling what you are trying to achieve. I suggest having a look at the source generation sample for an example of how to do this. Be advised there is a bug around consuming a library with generated source files. You can inconveniently work around it by explicitly adding, in the consumer project, a task dependency on the source generation task of the producer project.

This is less useful at the moment without actually being able to define/differentiate between debug, release, profile, performance, etc.

I agree with you on this. We are lacking the concept of build types and are discussing how this should be migrated over from the old plugins. We want something more strongly modeled and haven't reached a consensus yet.

How do I define multiple library/application outputs from a single build.gradle subproject file?

As a design choice, the new plugins can only configure a single component per project. We may revisit this choice later, but in a cross-cutting way with the JVM domain. That being said, it doesn't mean you can't share a single folder layout across multiple libraries/applications. I suggest you have a look at the Swift package manager sample, which shares a single source folder across multiple projects, thus creating multiple library/application outputs from a single build.gradle.

The output libraries/dlls are named based on the project name, I guess. How do I change that? How would I do that for build files that generate multiple libraries/dlls?

By default, the output file name is based on the subproject name. To change this, you need to set the component's base name like this:

application {
    baseName = "some-app-name"
}

What's the easiest way to globally change the output directory, i.e. "build", for all subprojects to be under one root? At the moment I am using this in each subproject, but I would prefer to do it in one place somewhere at the top level.

You can apply configuration to all projects using something like the following in your root build.gradle:

allprojects {
    buildDir = ...
}
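For example, a sketch that routes each subproject's output under a single root directory (the 'out' name is just an example, not a convention):

```groovy
// root build.gradle: each subproject builds under <root>/out/<project-path>
subprojects {
    buildDir = new File(rootDir, 'out' + project.path.replace(':', '/'))
}
```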

I suggest you read the Authoring Multi-Project Builds chapter of the user guide.

How do I add a new tool chain?

The new plugins still piggyback on the software model configuration for tool chains. See this section of the native chapter of the user guide.

Any examples of how to configure (path to exe, libraries to include, paths to system libraries, paths to system includes, etc) included tool chains?

There are several ways to achieve those specific configurations, and configuring the tool chains may not always be the solution. I suggest having a look at the user guide section about tool chains, and tell us about your specific use case.

Any examples of using resource compiler for windows UI based binaries?

Unfortunately, this hasn't been migrated over to the new native plugins. It would need to be wired in like any other source generation use case.

I have global macros defined for all projects in the root project's build.gradle so they are all in one place. I have an exception for one specific project, where I have to remove an already-defined macro. How do I remove an entry from the macros list?

At the moment, you would have to achieve this by reaching into the CppCompile tasks. However, this has lots of edge cases and is prone to configuration-order issues. It is best to model your macros as either 1) a separate extension or 2) opt-in configuration by calling methods. Option 1 may be overkill, but it would behave just like you are describing. The advantage is that you can model what each macro means. Option 2 is a lightweight configuration scheme that each project opts into. It's more explicit, but lacks the removal capability. Both options aren't mutually exclusive.
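A sketch of option 2, assuming the shared macros live in the root build.gradle (all names and values here are illustrative):

```groovy
// root build.gradle: the shared macros, defined once
ext.commonMacros = [PLATFORM_DESKTOP: '1', USE_LOGGING: '1']

// subproject build.gradle: opt in, excluding what this project doesn't want
def exclusions = ['USE_LOGGING']
tasks.withType(CppCompile) {
    rootProject.ext.commonMacros.each { name, value ->
        if (!(name in exclusions)) {
            macros.put(name, value)
        }
    }
}
```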

Could you provide a few pointers on how to publish a package (includes headers and libraries) using pre-built binaries?

What you are looking for is achieved inside the CMake library sample. More specifically, you would be looking at this part of the build wrapper plugin. The CMake build generates binaries that are considered prebuilt libraries by Gradle. We haven't started the story around prebuilt libraries yet and would be curious to know more about your use cases, more specifically:

  • What library are you using?
  • Are they built by 3rd-party build tools? If yes, which build tool is used?
  • Where do the binaries come from? Chocolatey, apt-get, conan.io, brew, plain installer, etc?
  • Do you require only released binaries or do you expect to build the latest modification?

How do I include non-.h includes in publicHeaders for publishing?

The publishing of the headers is handled by a Zip task. You would have to reach into that task and add your source files there. What use case do you have around adding non-.h includes to be published? If those are "include" files to be ingested by a tool other than the compiler, it would be better to publish them separately. The reason is that this artifact can then be resolved independently of the headers, which is especially important if the other tools don't require the headers to function properly. For example, protobuf .proto files should be published as a separate artifact.
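Assuming the headers are zipped by a task named cppHeaders (the task name is an assumption here; check `gradle tasks --all` in your build), the extra files could be added like this:

```groovy
// Reach into the headers Zip task and include the .inl files as well.
tasks.matching { it.name == 'cppHeaders' }.all {
    from('includes') {
        include '**/*.inl'
    }
}
```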

What I came up with:

  1. define headers and binaries paths
  2. generate headers zip (zip task)
  3. generate module file
  4. publish headers, module file and libraries to maven repo (maven publish)

@matemoln This is a great, but fragile, process for publishing prebuilt libraries. Although we document the module file format, manually generating it is bad practice. We always recommend modeling what you are trying to achieve through the Gradle public API. I suggest following a process similar to the one shown in the build wrapper plugin.

The problem I have is that I have to purge the gradle cache each time I update a package.

@matemoln Depending on how you publish/consume the library, you may be hitting the changing-module cache resolution strategy if you are using a dynamic version range, or breaking the assumption that a released version should never change once published. Either way, you should be able to force the artifacts to be fetched again by using the CLI flag --refresh-dependencies. Purging the Gradle cache should be avoided, as its location and implementation are internal to Gradle and may change (unlikely, but we don't want users to purge the cache manually).

I would guess there is a way to extend the existing artifact.

The artifact publication is still quite internal, but we aim to provide some public APIs for extension through plugins.

I’ve asked around about creating the module metadata file via an api multiple times, but haven’t got an answer.

The module metadata is in full development, which makes answering questions hard. We should have at least answered something, and for that I'm sorry for not providing the right guidance. It will be documented better soon-ish.

Do you have an example that I can work off of? What’s needed is a ‘cpp-api’ plugin just like ‘cpp-library’ and ‘cpp-application’.

You should be able to find your answer in the samples referenced above.

Thanks for all those questions, we always welcome community feedback and we are open to all use cases you are using in your native build. Don’t hesitate to ask more questions,

Daniel


(Shahji) #18

Hello @Daniel_L,

Appreciate the response, and apologies for the lack of communication on my part. There was a little too much to digest from your last response, so it took a little longer. :slight_smile:

QtMocGeneratorTask is supposed to run only on files that have changed. If it does not depend on CppCompile, then how do I achieve this relation?

This is a serious limitation, especially when porting existing big codebases. We have lots of small modules that generate multiple smaller libraries and also generate one big shared library or application. The internal smaller modules (which are usually static libraries) can be used as external dependencies. A simple example is a plugin architecture - the headers defining the interface for writing a plugin are used by our own plugins, but are also built as a separate library for anyone else to use.

We have a whole slew of library dependencies, with extensive variety in what's "prebuilt". In fact, we wish to publish everything we do as prebuilt libraries as well. Depending on the platform and where the library comes from, the tool used to build it can be different. We have some built using make, some NAnt, CMake, Visual Studio, Xcode packages, waf, ninja, .NET assemblies, and maybe more. Many of these we don't ever intend to build. For some of these we don't even have the source code. These are prebuilt, prepackaged third-party libraries. I don't think we should focus on where or how these are built - that's a deep rabbit hole. The focus should be that we have cases where we need to package and publish any combination of headers, libs, and dlls. And I mean any permutation of these - including only headers (rapidxml, rapidjson, boost), only libs (testing frameworks, partial libs), and only dlls (DirectX runtime).

P.S. I will open a separate discussion to discuss publication pipeline. Will include specific examples wherever possible.

Inline implementations of functions are typically (not a standard, but good practice) put in a separate file, which gets included at the end of the header file explicitly. The extension used is typically '.inl'. Regardless, there are other third-party libraries we use which use .hpp and .hxx when they want to call out private headers that are still included from the public headers but are not intended to be used outside the package itself. We have other use cases as well, with library-specific file extensions and proprietary implementations.

Attaching code I wrangled together using the referenced project to publish a prebuilt library. Comments/corrections are welcome.

Prebuilt.zip (4.2 KB)

I tried to support the "headers only" scenario, which I managed to publish, but failed to resolve dependencies when referenced.

More questions below -

  • How do I refer to a "headers only" dependency in other projects - using api or implementation or something else? I got it working using api for library projects, but the same won't work for applications. For an application, use of an api dependency throws errors.

library {
    dependencies {
        api "com.xyz:rapidxml:dev" // This works
    }
}

application {
    dependencies {
        api "com.xyz:rapidxml:dev" // This doesn't work
    }
}

  • Are macros exported from libraries, both prebuilt and built? When a library is built, there are certain macros which are public and are required to be defined by everybody who references the library as a dependency. Is this something that's supported, and if not, how can I work around it? Note that the dependency can be a published package, so these public macros will somehow have to be included in the generated manifest or something. This should be part of transitive dependency support.
  • Any option to publish project source as a package?
  • The current publication pipeline publishes 3 binaries for each module - headers, debug, and release (or, in the future, headers and a binary for each variant). Any reason to split these into so many pieces? Why not just publish a single binary with everything in it? In an ideal world, I would like to define what the published binary should include.
  • For transitive dependencies, how do I add additional include paths? For instance, for the following hierarchy -

include
  MyModule
    PlatformA
      MyModule
        PlatformAHeader.h
      MyModuleHeader.h

I declare the include folder as 'publicHeaders', and that resolves includes like "MyModule/MyModuleHeader.h" but not "MyModule/PlatformAHeader.h". The files, however, are there in the folder, since putting include (the root) as the publicHeaders includes everything below it. How do I add include/MyModule/PlatformA as another exported header path without actually having a second copy of the files in the packaged archive?

One approach is to exclude the folder from the root and explicitly include the subfolder. That basically flattens the hierarchy. I would rather not do that, because there is a risk of overwriting files in that scenario if the subfolder names happen to be the same.

  • A more detailed question will follow in a new thread in the context of publishing.

Publishing using the new native plugin
(Shahji) #19

Can I get a comment from the core native support team? I am too far in to bail on this effort. I would like to take this to the finish line, and I have a few blockers at the moment.


(Daniel Lacasse) #20

My bad for the late reply @Shahji, I didn’t want to rush a response to this excellent discussion.

QtMocGeneratorTask is supposed to run only on files that have changed. If it does not depend on CppCompile, then how do I achieve this relation?

With the native plugins, we are trying as much as possible to wire the model together implicitly, as you would normally expect. For example, a FileCollection can have builtBy tasks. When you pass around the FileCollection, the task dependencies follow without having to wire everything together manually. In the case of the source generation sample, the DirectoryProperty held by the generator task carries the task dependency on whatever generates the Directory. Passing the property to the CppCompile sources will wire the dependency automatically. It's not wrong to wire the CppCompile task to the QtMocGeneratorTask; it's just better to configure the model with the task dependency instead of manually wiring everything.
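Assuming your QtMocGeneratorTask exposes its output directory as a DirectoryProperty (an assumption about your task implementation), the wiring could look like:

```groovy
task generateMocSources(type: QtMocGeneratorTask) {
    outputDir = layout.buildDirectory.dir('generated/moc')
}

library {
    // No explicit dependsOn: the DirectoryProperty carries its producing task.
    source.from generateMocSources.outputDir
}
```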

the new plugins can only configure a single component per project

It's mostly a simplification we decided on, as opposed to how the software model worked. There's a distinction between a single component and a single variant. For the same code (component), you may have multiple variants; debug and release are a set of variants. We are looking at helping developers model variants in a much richer way for each component. For example, a complex piece of software like vim has several components (GUI application, CLI application, and libraries); however, if you take the application component (the actual vim executable), you have several variants. Each of these variants can be depended on or published individually. It's also important to note that the Gradle logical structure of your project doesn't need to follow the on-disk source layout.

That being said, we are likely to allow multiple components per project at a later point. We want to postpone adding such a feature because we feel that most multi-component usage has been a replacement for our poor support for variants. Instead, we will focus on first-class variant support and then evaluate the specific use cases for multiple components.

This is a serious limitation, especially when porting existing big codebases.

I'm assuming here that you are talking about porting big existing codebases from the Software Model plugins? You should still be able to achieve what you want; it may not be convenient, though.

In fact, we wish to publish everything we do as prebuilt libraries as well. Depending on the platform and where the library comes from, the tool used to build it can be different.

I agree with you that, with such a polyglot build infrastructure, it's best to have each build take responsibility for publishing its binaries to a repository. In a perfect world, Gradle would understand any binary repository, so you could use the easiest repository to publish to from each build system and have Gradle consume from all of them using its powerful dependency engine. However, we are far from this goal. Generating the metadata file manually may be an interesting idea in this case; however, you have to understand that we don't support this use case, and you will be left on your own if something breaks between versions. An alternative would be to package a tiny Gradle build used to delegate the publishing from the other build systems (see your Prebuilt.zip code).

Inline implementations of functions are typically (not a standard, but good practice) put in a separate file, which gets included at the end of the header file explicitly. The extension used is typically '.inl'. Regardless, there are other third-party libraries we use which use .hpp and .hxx when they want to call out private headers that are still included from the public headers but are not intended to be used outside the package itself.

Those are good reasons to have non-.h includes published. I opened this issue to create a sample demonstrating how this can be done now. The convenience will be worked on at a later time.

Attaching code I wrangled together using the referenced project to publish a prebuilt library. Comments/corrections are welcome.

Your code for the prebuilt libraries and publishing them is pretty much spot on with what is currently available.

I tried to support the "headers only" scenario, which I managed to publish, but failed to resolve dependencies when referenced.

The error for the application component is simply based on the fact that an application cannot be depended on, so api dependencies make no sense. You should use implementation for the application component. I'm not sure if this will solve your problem; you can try the following workaround if you still have issues.

The "headers only" scenario is something we need to fix sooner rather than later, as the workaround is pretty hard to grasp. There are two ways to work around it:

  1. If you are building your code, you can generate a dummy source file with a symbol that is never used. Using the right flag with GCC and Clang will discard the symbol from the linked binaries (shared library or executable), as it isn't referenced anywhere.
  2. Use the cppCompileDebug and cppCompileRelease configurations in the Project#dependencies DSL. The dependencies DSL under library and application allows only api and implementation, which are forwarded to the Project#dependencies DSL. Something like this would add a dependency only for the headers:
dependencies {
    cppCompileDebug "com.xyz:rapidxml:dev"
    cppCompileRelease "com.xyz:rapidxml:dev"
}
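Workaround 1 could be sketched as follows, assuming the cpp-library plugin and a GCC or Clang toolchain. The task and file names here are invented for illustration:

```groovy
// Generate a dummy source file so the library has something to compile.
def dummyDir = layout.buildDirectory.dir('generated/dummy')
def generateDummySource = tasks.register('generateDummySource') {
    outputs.dir(dummyDir)
    doLast {
        // A symbol that nothing references, so the linker can drop it.
        new File(dummyDir.get().asFile, 'dummy.cpp').text =
            'namespace { void neverUsed() {} }\n'
    }
}

library {
    // Compile the generated file alongside the (header-only) sources.
    source.from(fileTree(dummyDir).builtBy(generateDummySource))

    binaries.configureEach {
        // Place each function and data item in its own section...
        compileTask.get().compilerArgs.addAll('-ffunction-sections', '-fdata-sections')
    }
    binaries.configureEach(CppSharedLibrary) {
        // ...so the linker can garbage-collect the unreferenced ones.
        linkTask.get().linkerArgs.add('-Wl,--gc-sections')
    }
}
```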

Are macros exported from libraries, for both prebuilt and built?

This is something we will need to support. At the moment, we don’t have a sample to show how this can be achieved, but here is how I would do this.

  • Create a task for each variant that generates the exported macro manifest. Each task should be configured with the macros to export. The inputs of this task are the macros and anything else related to the variant. The output should be a file containing the macro values (it could be just a simple command-line snippet representing the macros, e.g. -Dmacro1=value1 -Dmacro2 ...). The result should be exposed via a getter.
  • Wire the generated command-line snippet into the right compile task. You don’t need to add a task dependency here; within the same build, you only depend on the task’s configuration.
  • Add a resolvable and a consumable configuration for the exported macros of each variant. Make sure you set the usage attribute to something like “exported macros”. One configuration is wired to the task that generates the file containing the command-line snippet of the exported macros; the other is used to consume the exported macro files of dependencies.
  • Wire the consuming configuration into the right compile task. Here you do need to add a dependency on the configuration from the compile task, in case a composite build is used, so the task dependencies are wired in correctly.
  • Add an artifact to publish the command-line snippet files together with the other artifacts.
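The steps above could be sketched like this for a single variant (“debug”), assuming the cpp-library plugin. The task name, configuration names, attribute name, and usage value are all invented for illustration:

```groovy
def exportedMacros = [ENABLE_FOO: '1', BAR_MODE: 'fast']
def macroFile = layout.buildDirectory.file('macros/debug/macros.txt')

// Generates the exported macro manifest for the debug variant.
def generateMacros = tasks.register('generateExportedMacrosDebug') {
    inputs.property('macros', exportedMacros)
    outputs.file(macroFile)
    doLast {
        // A command-line snippet representing the macros, e.g. -DENABLE_FOO=1
        macroFile.get().asFile.text =
            exportedMacros.collect { k, v -> "-D${k}=${v}" }.join(' ')
    }
}

def macroUsage = Attribute.of('org.example.exported-macros', String)

// Consumable configuration: exposes the macro manifest to other projects.
configurations.create('exportedMacrosDebugElements') {
    canBeResolved = false
    canBeConsumed = true
    attributes.attribute(macroUsage, 'exported-macros')
}
artifacts {
    add('exportedMacrosDebugElements', macroFile) { builtBy generateMacros }
}

// Resolvable configuration: consumes macro manifests from dependencies.
def incomingMacros = configurations.create('exportedMacrosDebug') {
    canBeResolved = true
    canBeConsumed = false
    attributes.attribute(macroUsage, 'exported-macros')
}

// Wire the consumed snippets into the matching compile task. The explicit
// dependsOn matters for composite builds, as noted above.
tasks.named('compileDebugCpp') {
    dependsOn incomingMacros
    compilerArgs.addAll(provider {
        incomingMacros.files.collectMany { it.text.trim().split(/\s+/) as List }
    })
}
```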

All these steps can be abstracted away inside a plugin. It would be nice if you could contribute a sample demonstrating this; we can point you in the right direction. Use the issue for this sample to continue the discussion on this.

Any option to publish project source as a package?

This is also something we want to add, especially to improve IDE support. Unfortunately, we don’t have a sample that shows how to do it. At the moment, you would have to wire this up manually, for example by creating a configuration with a “source” usage and attaching a Zip task that compresses the entire source of the component.
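Such manual wiring might look like the following sketch, assuming the cpp-library plugin; the configuration name and usage value are invented, and the paths follow the plugin’s default layout:

```groovy
// Zip up the public headers and sources of the component.
def sourceZip = tasks.register('sourceZip', Zip) {
    from 'src/main/public'
    from 'src/main/cpp'
    archiveClassifier = 'sources'
}

// Consumable configuration carrying a made-up "sources" usage.
configurations.create('sourceElements') {
    canBeResolved = false
    canBeConsumed = true
    attributes.attribute(
        Usage.USAGE_ATTRIBUTE,
        objects.named(Usage, 'cplusplus-sources')) // invented usage value
}

artifacts {
    add('sourceElements', sourceZip)
}
```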

The current publication pipeline publishes 3 binaries for each module - headers, debug and release (or in the future, headers and a binary for each variant). Any reason to split these into so many pieces? Why not just publish a single binary with everything in it?

As the number of variants increases, it’s easier and better to resolve only what you need. The number of published binaries adds no burden on Gradle. It also allows you to swap modules when “hacking” dependencies using substitution rules with Gradle’s dependency engine.
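As an example of such “hacking”, a substitution rule can replace a published module with a locally built project; the coordinates and project path below are placeholders:

```groovy
configurations.all {
    resolutionStrategy.dependencySubstitution {
        // Use the local ':mylib' build instead of the published binary.
        substitute module('com.xyz:mylib') using project(':mylib')
    }
}
```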

In an ideal world, I would like to define what the published binary should include.

It’s important to understand that the way Gradle publishes binaries is just a suggestion. You could publish everything another way, such as a single archive, and resolve from it instead, whether through Gradle or through other builds. Separate artifacts are simply the most flexible option for Gradle with regard to forward compatibility.

For transitive dependencies, how do I add additional include paths. For instance, for the following hierarchy

At the moment, we limit the number of public header roots to one per component. For multiple public header roots, there are considerations to think about. The most important one is whether the additional header root is part of the same component, or of a different component colocated inside the same on-disk layout. This is often an issue in native software development, where developers use a single include directory for all components, which hides the dependencies between components. In your case, it looks like you need something like variant-specific headers.

I hope this answers all your questions. Don’t hesitate to ask more questions or request clarification on what was discussed here.