Download Maven dependencies from S3 to custom folder


I’m trying to find a way to create a task which will download specified dependencies from an S3 Maven repo to some custom folder, together with all the necessary Maven XML files.

Then the next step is to upload that folder to another S3 bucket (that part I’ve already solved).

The idea behind all that is that at the build stage we prepare the artifact in a temp bucket; then, after all tests (integration, performance, UAT, …), we need to publish the artifact from the temp bucket to the public bucket, but without rebuilding the library again. One way is to manually pull all the files and then copy them to the publish bucket.
But is there a better way?

Usually this kind of thing is done on a CI server.
One job builds your library, another tests it, another performs quality checks, and so on.
Jenkins, for instance, has nice plug-ins to promote an already executed build when certain criteria are met.

If you really want to do it via Gradle, you have to retrieve the files as you said, and republish them elsewhere.
You can use an ArtifactResolutionQuery to retrieve your artifacts, or a simple Gradle Configuration coupled with custom dependencies.
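A minimal sketch of the Configuration approach, assuming Gradle 2.x-era APIs; the configuration name, dependency coordinates, and target folder below are placeholders, not from this thread:

```groovy
import org.gradle.api.artifacts.component.ModuleComponentIdentifier
import org.gradle.maven.MavenModule
import org.gradle.maven.MavenPomArtifact

configurations {
    toPublish
}

dependencies {
    // Example coordinates -- replace with the dependencies you need
    toPublish "com.example:my-library:1.0.0"
}

// The resolved jars can simply be copied out of the configuration
task fetchArtifacts(type: Copy) {
    from configurations.toPublish
    into "$buildDir/artifacts"
}

// The jars alone are not enough for a Maven repo layout; fetch the pom
// files as well via an ArtifactResolutionQuery.
task fetchPoms {
    doLast {
        def componentIds = configurations.toPublish.incoming.resolutionResult
                .allComponents
                .findAll { it.id instanceof ModuleComponentIdentifier }
                .collect { it.id }
        def result = dependencies.createArtifactResolutionQuery()
                .forComponents(componentIds)
                .withArtifacts(MavenModule, MavenPomArtifact)
                .execute()
        result.resolvedComponents.each { component ->
            component.getArtifacts(MavenPomArtifact).each { pom ->
                copy {
                    from pom.file
                    into "$buildDir/artifacts"
                }
            }
        }
    }
}
```

A plain Configuration resolves only the jars; the ArtifactResolutionQuery part is what retrieves the accompanying pom files, so the target folder can be laid out as a Maven repo.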

Hi, I found another way: directly copying from one bucket to another.
The only requirement is that one user has read permission on bucketFrom and write permission on bucketTo.

Here’s a code sample; maybe it will be useful to someone:

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        // JetS3t client used by the S3CopyBucket task below; adjust the version to the one you use
        classpath "net.java.dev.jets3t:jets3t:0.9.4"
    }
}
task copyS3(type: S3CopyBucket) {
    group = "Amazon S3"
    description = "Copy published artifacts from S3 bucket to another S3 bucket"

    doFirst {
        awsS3BucketName = getPropValue('AWS_S3_BUCKET_NAME', true)
        awsS3BucketNameTo = getPropValue('AWS_S3_BUCKET_NAME_RELEASE', true)
        awsAccessKey = getPropValue('AWS_ACCESSKEY_RELEASE', true)
        awsSecretKey = getPropValue('AWS_SECRETKEY_RELEASE', true)
    }
}

import org.gradle.api.DefaultTask
import org.gradle.api.GradleException
import org.gradle.api.tasks.TaskAction

import org.jets3t.service.S3Service
import org.jets3t.service.impl.rest.httpclient.RestS3Service
import org.jets3t.service.model.S3Object
import org.jets3t.service.security.AWSCredentials

class S3CopyBucket extends DefaultTask {

    def awsS3BucketName

    def awsS3BucketNameTo

    def awsAccessKey

    def awsSecretKey

    @TaskAction
    def copyFromBucketToBucket() {
        // Maven layout: groupId as folders, then the artifact name
        String path = project.group.toString().replace(".", "/") + "/" + project.name

        AWSCredentials awsCredentials = new AWSCredentials(awsAccessKey, awsSecretKey)
        S3Service s3Service = new RestS3Service(awsCredentials)
        logger.info("Copy objects from bucket: {}, to bucket: {}. Project version: {}",
                awsS3BucketName, awsS3BucketNameTo, project.version)

        S3Object[] objects = s3Service.listObjects(awsS3BucketName, path, null)
        for (S3Object object : objects) {
            String file = object.getKey()
            String key = object.getKey()
            key = key.substring(path.length() + 1)

            // Copy top-level metadata files and everything under the current version folder
            if ((!key.contains("/")) || (key.contains("/") && key.startsWith(project.version.toString()))) {
                logger.debug "Copy object: {}", object.getKey()

                S3Object targetObject = new S3Object(file)
                s3Service.copyObject(awsS3BucketName, file, awsS3BucketNameTo, targetObject, true)
            } else {
                logger.debug "Exclude object: {}", object.getKey()
            }
        }
    }
}
def getPropValue(String propertyName, boolean throwException = false) {
    if (project.hasProperty(propertyName)) {
        logger.debug "Use Project level property"
        return project.property(propertyName)
    } else if (System.getProperty(propertyName) != null) {
        logger.debug "Use System level property"
        return System.getProperty(propertyName)
    } else if (System.getenv(propertyName) != null) {
        logger.debug "Use Environment level property"
        return System.getenv(propertyName)
    }

    if (throwException) {
        throw new GradleException("Property variable '$propertyName' is not defined (on Project, System or Environment level)")
    }

    logger.warn "Can't find property '$propertyName'"
    return null
}