Each customer (cat, dog, mouse) needs to include the source code from certain sub-projects from animal.
I’m guessing what you mean here is that each customer can have a binary dependency on the animal sub-projects (ie a dependency on the classes/jar rather than sources). Is this correct?
Each customer (cat, dog, mouse) needs to be able to override certain classes from animal
The normal pattern to do this is to have an interface (or base class) in the animal project which has a different implementation in each customer project (ie each with a different class name). Is this what you mean? Are you familiar with dependency injection / inversion of control (IOC)?
Lance,
Thanks for the response. I am familiar with DI/IoC. I am trying to visualize this using sub-projects for each customer. The caveat is, not every customer will have changes to the same class.
I probably need to restate what I am looking for with a better explanation. The example I gave above was just something I saw from another person’s question on the link provided. I will ask differently based on a couple of places I worked.
Let’s say I have an application that is considered the core product. This core product is purchased by several customers. However, some customers need custom changes, but not all. Each custom change is different.
Make a copy of the core product for each customer and customize it. This is easy to do at first, but terrible for upgrades.
The core product had a customer folder which contained each customer and their custom code. If the core product had a class named BatchProcessing and there was a customer called ACME that needed that class customized, the customized class would be named ACMEBatchProcess and would extend BatchProcessing. When a change was made to the core product, there was no need to repeat it for each customer; customer code only needed attention if an issue surfaced during build/unit/integration testing. Upgrades were very easy. See diagram below.
Both of these approaches work, especially if the core code is sold as a whole to each customer.
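The second approach (the customer folder with extended classes) can be sketched roughly as follows. The class names come from the example above; the method, its return values, and the folder layout in the comments are assumptions for illustration:

```java
// Core product class (e.g. under the core src folder).
class BatchProcessing {
    public String process() { return "core batch logic"; }
}

// Customer override, living under the ACME customer folder.
// Only the delta is customized; everything else is inherited.
class ACMEBatchProcess extends BatchProcessing {
    @Override
    public String process() { return super.process() + " + ACME customization"; }
}
```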
Now, this is where my question begins. Let’s say we need to break the core product up into separate products, because the code is way too big. Also, not every customer is using everything, which would allow us to sell the products separately. The approaches are:
Separate core products/applications. This would mean maintaining custom code on separate products and multiple deployments.
Create sub-projects. The build for each customer would contain the sub-projects to include.
So, what I am looking for is the approach of using sub-projects, but also be able to customize the code per customer.
@Configuration
public class BatchCommonConfiguration {
    @Bean
    @ConditionalOnMissingBean
    public BatchProcessor batchProcessor() { return new DefaultBatchProcessor(); }
}
@Configuration
public class AcmeConfiguration {
    // this will override the BatchProcessor from batch-common
    @Bean
    public BatchProcessor batchProcessor() { return new AcmeBatchProcessor(); }
}
For clients that don’t specify a BatchProcessor, the DefaultBatchProcessor will be used.
I see where you are going, but taking this approach is not a good option for me. If there was only one line of code that needed to be added to ACMEBatchProcessor, I would basically be copying the DefaultBatchProcessor just to make a single change.
Some customers will have minor changes, such as an additional attribute on an entity, which then needs to be handled in various classes throughout the core application. Two different companies I worked for in the past used different approaches:
Copy the core application and customize it for each customer, which is a pain for upgrades.
Under the core src, have a customer folder containing src folders for each customer. For each customer, the classes that need changing are extended. Upgrades were easier.
I was looking for a different approach to this, since each core application could be huge. I wanted to break it apart into smaller projects, yet be able to customize.
If there was only one line of code that needed to be added to ACMEBatchProcessor, I would basically be copying the DefaultBatchProcessor just to make a single change.
I disagree, ACMEBatchProcessor could extend DefaultBatchProcessor and override a method with slightly different logic. Ultimately it could contain a single line of logic with zero duplication.
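A sketch of that point, reusing the processor names from the configuration example above (the fee logic itself is made up for illustration):

```java
import java.math.BigDecimal;

class DefaultBatchProcessor {
    public BigDecimal process(BigDecimal amount) {
        return amount.add(calculateFee(amount));
    }
    protected BigDecimal calculateFee(BigDecimal amount) {
        return BigDecimal.ZERO;
    }
}

// No copied code: the subclass overrides one method and inherits the rest.
class AcmeBatchProcessor extends DefaultBatchProcessor {
    @Override
    protected BigDecimal calculateFee(BigDecimal amount) {
        return amount.multiply(new BigDecimal("0.05")); // hypothetical 5% fee
    }
}
```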
Another option is to have all the logic in DefaultBatchProcessor and you just configure it differently in BatchCommonConfiguration and AcmeConfiguration (different constructor parameters)
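A sketch of the constructor-parameter variant: a single implementation whose behaviour differs only in how each configuration constructs it (the feeRate parameter is a hypothetical example):

```java
import java.math.BigDecimal;

// One class for every customer; the per-customer difference is just data.
class DefaultBatchProcessor {
    private final BigDecimal feeRate;
    public DefaultBatchProcessor(BigDecimal feeRate) { this.feeRate = feeRate; }
    public BigDecimal calculateFee(BigDecimal amount) { return amount.multiply(feeRate); }
}
// BatchCommonConfiguration might do: new DefaultBatchProcessor(BigDecimal.ZERO)
// AcmeConfiguration might do:        new DefaultBatchProcessor(new BigDecimal("0.05"))
```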
Another option is to use composition, and extract the custom logic to a single method interface (eg BatchFeeCalculator) which has its own “default” and “acme” implementations.
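And a sketch of the composition variant: BatchFeeCalculator is the single-method interface suggested above, while the surrounding processor and its method are illustrative:

```java
import java.math.BigDecimal;
import java.util.List;

interface BatchFeeCalculator {
    BigDecimal fee(BigDecimal amount);
}

// The processor is shared; only the fee calculation is swapped per customer.
class BatchProcessor {
    private final BatchFeeCalculator feeCalculator;
    public BatchProcessor(BatchFeeCalculator feeCalculator) { this.feeCalculator = feeCalculator; }

    public BigDecimal processBatch(List<BigDecimal> amounts) {
        return amounts.stream()
                .map(a -> a.add(feeCalculator.fee(a)))
                .reduce(BigDecimal.ZERO, BigDecimal::add);
    }
}
```

The “default” and “acme” implementations then become one-liners, e.g. `a -> BigDecimal.ZERO` versus a customer-specific lambda or class, each wired up in its own configuration.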
Option 1: Use a single entity type for all customers, the entity has nullable fields and custom validators/handlers for each customer. Have different configuration to show/hide the fields per customer in the UI.
Option 2: Use generics in the services and have custom implementations of entities per customer
public class BatchEntry {
    public long getId() { ... }
    public BigDecimal getAmount() { ... }
}
public class AcmeBatchEntry extends BatchEntry {
    public BigDecimal getSomeExtraField() { ... }
}
public class DefaultBatchProcessor<T extends BatchEntry> {
    public BigDecimal processBatch(List<T> batch) {
        return batch.stream()
            .map(entry -> entry.getAmount().add(calculateFee(entry)))
            .reduce(BigDecimal.ZERO, BigDecimal::add);
    }
    protected BigDecimal calculateFee(T entry) {
        return BigDecimal.ZERO;
    }
}
public class AcmeBatchProcessor extends DefaultBatchProcessor<AcmeBatchEntry> {
    @Override
    protected BigDecimal calculateFee(AcmeBatchEntry entry) {
        return entry.getSomeExtraField().multiply(entry.getAmount());
    }
}
Option 1: Been there, done that. It only makes things harder when a change that a few customers want is pushed into the core product. Then you have to determine which customers have that change and which field they use, and then what the other customers use that field for.
Option 2: I thought about generics. But, can this be handled in multi-projects as described in my original question? Where the DefaultBatchProcessor class is in the core project and AcmeBatchProcessor class in the Acme project?
can this be handled in multi-projects as described in my original question? Where the DefaultBatchProcessor class is in the core project and AcmeBatchProcessor class in the Acme project
I’d need to understand your use case better, but it should work. I feel with generics and interfaces you can solve this problem.
If you still want to do your original idea, where sources from one project are used in another if overrides do not exist, this should also be possible. But I feel it’s a bit of a hack.
Eg: build.gradle
apply plugin: 'java'

// this task copies java sources from the :base project to $buildDir/copyJava
// for any file where there is NOT an override in the current project
task copyJava(type: Copy) {
    ext.outputDir = "$buildDir/copyJava"
    from project(':base').sourceSets.main.java
    into ext.outputDir
    eachFile { FileCopyDetails action ->
        String path = action.relativePath.pathString
        boolean overridden = file("src/main/java/$path").exists()
        if (overridden) action.exclude()
    }
}

// wire the copyJava task into the DAG
compileJava.dependsOn 'copyJava'

// add the copied source directory to sourceSets.main.java
sourceSets.main.java.srcDir copyJava.outputDir
If you’re going to do this in more than one place it would be worth refactoring to a plugin under buildSrc.
This is starting to sound a lot like my monkey patch plugin which is also a hack!