It’s a long story, but I’ll try to give you the background. Today we have a bunch of zip files — around 120. The content, filenames, number of artifacts, etc. change all the time. Right now they just sit on an NFS storage area. They are patches delivered by scores of different product teams in my company, and we deliver these zip files to our customers. What I want to do is move them from this storage area into Artifactory, so I can take advantage of the ability to stage them through different phases of a release pipeline.
So the Gradle build I am playing around with has no compilation or test step. I simply want to crawl this storage area, pick up all the zip files, and get them into Artifactory.
Then, on the consuming side, I want to be able to retrieve all the zip files and perform certain actions on them. This is why I need to know where they get put when they are retrieved.
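To make the "where do they land" question moot, one option is to resolve the zips through a custom configuration and copy them to a directory of your choosing. This is only a sketch — the repository URL and coordinates are invented for illustration:

```groovy
// Hypothetical consumer-side build.gradle: resolve the published zips
// and copy them into a known local directory, so follow-on tasks have
// a predictable path to work with.
configurations {
    patchZips
}

repositories {
    maven { url 'https://artifactory.mycompany.com/artifactory/patches-release' }
}

dependencies {
    // These coordinates are assumptions -- in practice they would come
    // from the aggregate descriptor or a version property.
    patchZips 'com.mycompany.patches:patch-bundle:1.0@zip'
}

// Copy the resolved files somewhere deterministic: the zips end up
// wherever you point the Copy task, here build/patches.
task fetchPatches(type: Copy) {
    from configurations.patchZips
    into layout.buildDirectory.dir('patches')
}
```

With this approach the consumer never needs to know the file names or count up front; whatever the configuration resolves is what gets copied.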
I would love to take advantage of the buildInfo that Artifactory tracks, so that I can know which build copied which zips into Artifactory. I don’t seem to get this when using the REST interface, and I am unable to get any other mechanism going since there is no Maven or Ivy build here — just arbitrary artifacts of ever-changing size and name. Is there a way to dynamically add them to a configuration and then publish that configuration?
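For what it’s worth, Gradle does let you attach files to a configuration at execution-configuration time via `artifacts.add`, and the JFrog Artifactory Gradle plugin can publish a named configuration (which is what gets the uploads recorded in buildInfo, unlike raw REST deploys). A rough sketch of what I have in mind — paths, repo key, and plugin version are all placeholders:

```groovy
// Hypothetical build.gradle: crawl the NFS drop area, attach every zip
// to a custom configuration, and let the Artifactory plugin publish
// that configuration so the artifacts show up in buildInfo.
plugins {
    id 'com.jfrog.artifactory' version '4.33.1'  // version is a guess
}

group = 'com.mycompany.patches'
version = System.getenv('BUILD_NUMBER') ?: '1.0-SNAPSHOT'

configurations {
    patchZips
}

// Names and counts vary per build, so discover the zips dynamically.
fileTree(dir: '/nfs/patch-drop', include: '**/*.zip').each { zip ->
    artifacts.add('patchZips', zip) {
        name = zip.name - '.zip'
        type = 'zip'
    }
}

artifactory {
    contextUrl = 'https://artifactory.mycompany.com/artifactory'
    publish {
        repository {
            repoKey = 'patches-staging-local'
            username = project.findProperty('artifactoryUser')
            password = project.findProperty('artifactoryPassword')
        }
        defaults {
            // Publishing the configuration (rather than deploying over
            // REST) is what associates the artifacts with buildInfo.
            publishConfigs('patchZips')
        }
    }
}
```

Running `gradle artifactoryPublish` against something like this should deploy every discovered zip and record them all under one build in the Artifactory build browser.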
I was also thinking that I could create a “global” pom that tracks all the uploads I did. Then the consumer could just declare a dependency on the single “global” pom. Since the consumer has no idea of the size, number, or names of the files, I figured this was the only way to go.
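The producer side of that idea could be a `maven-publish` publication that generates the aggregate pom programmatically, listing one `<dependency>` per discovered zip. Again just a sketch with invented coordinates and paths:

```groovy
// Hypothetical producer-side fragment: generate a "global" pom whose
// only job is to depend on every zip found in the drop area. A
// publication with no artifacts defaults to pom packaging.
plugins {
    id 'maven-publish'
}

group = 'com.mycompany.patches'
version = '1.0'

publishing {
    publications {
        aggregate(MavenPublication) {
            artifactId = 'patch-aggregate'
            pom.withXml {
                def deps = asNode().appendNode('dependencies')
                // One dependency entry per zip, named after the file.
                fileTree(dir: '/nfs/patch-drop', include: '**/*.zip').each { zip ->
                    def d = deps.appendNode('dependency')
                    d.appendNode('groupId', project.group)
                    d.appendNode('artifactId', zip.name - '.zip')
                    d.appendNode('version', project.version)
                    d.appendNode('type', 'zip')
                }
            }
        }
    }
}
```

The consumer would then depend only on `com.mycompany.patches:patch-aggregate:1.0` and pick up the actual zips through transitive resolution, which keeps the consumer insulated from the ever-changing file names.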