Is it possible to edit a file while it is in-flight as part of a copy?

I’m currently writing up a build file which is a step in our overall continuous delivery pipeline. In a nutshell, some tar files will get thrown into a directory, and from there we need to trigger a job to make a bunch of directory-structure changes and edit some of the files.

My problem is that while I can get the reorganising of files to work fine using something like the following:

    if (!project.hasProperty("environmentkey")) {
      ext.environmentkey = "NONE"
    }
    fileTree(dir: 'input').include('*.tar').each { tarFile ->
      def name = tarFile.name - '.tar'
      copy {
        from tarTree("input/${tarFile.name}")
        into "build/${name}"
      }
      copy {
        from "build/$name/shared"
        into "build/workspace/$name/$environmentkey/import/definition"
        include '**/*.objc'
      }
      copy {
        from "build/$name/createComms"
        into "build/package/$name/$environmentkey/import/creation"
        include '**/*.cmd'
      }
      copy {
        from "build/$name/disk"
        into "build/workspace/$name/$environmentkey/import/files"
      }
...
etc.

I know that using methods for the copying isn’t ideal but it fits our needs almost perfectly.

That is, until the need to make some changes to the files, which I think might be a little harder. The end goal should be a package directory sitting in build which contains all of the properly structured, appropriately formatted files.
The “workspace” directory I’ve defined above is just an interim measure, but preferably I wouldn’t need to be creating copies of the files all over the place to get something relatively simple done.

Rather than having another job that performs changes on the files and then copies them again into the package directory, is there any way to apply changes to the files while they are in motion?

To give an actual example, all of the files that are going into the “build/workspace/$name/$environmentkey/import/files” directory need to:

  • be base64 encoded so they are consumable by another system
  • have a header attached which contains a @PATH@ token, which needs to be replaced with the original relative file path
  • end up in that same package directory so we can throw it into our Maven repo.

Currently, looking at all the options I can find it looks like I would need to write a class which accepts two files (one of which would be my static header), base64 encodes the non-header file, appends the encoded data to the header file, replaces the @PATH@ token in the header file with the actual file path relative to the project directory, then outputs that file to the place where I was going to be copying it to. But that seems a little complicated for my current familiarity with gradle so I’m wondering if there is something more simple that I’ve missed.
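For what it’s worth, the class described above doesn’t need much code on the JVM since java.util.Base64 is in the standard library. A rough sketch (the class name, method name, and header layout here are illustrative assumptions, not anything from the thread):

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class HeaderEncoder {

    /**
     * Reads the content file, base64-encodes it, replaces the @PATH@ token
     * in the header template with the given relative path, and returns
     * header-plus-encoded-content as a single string.
     */
    static String encodeWithHeader(Path header, Path content, String relativePath) throws Exception {
        String headerText = new String(Files.readAllBytes(header), StandardCharsets.UTF_8)
                .replace("@PATH@", relativePath);
        String encoded = Base64.getEncoder()
                .encodeToString(Files.readAllBytes(content));
        return headerText + encoded;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical inputs, just to demonstrate the shape of the output.
        Path header = Files.createTempFile("header", ".txt");
        Path content = Files.createTempFile("content", ".objc");
        Files.write(header, "# source: @PATH@\n".getBytes(StandardCharsets.UTF_8));
        Files.write(content, "hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(encodeWithHeader(header, content, "shared/foo.objc"));
    }
}
```

The remaining question, as discussed below, is how to wire this into the copy itself rather than running it as a separate pass.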

You could do this in a single copy using Copy.with(CopySpec) and a filter.

E.g.:

// requires: import org.apache.tools.ant.filters.ReplaceTokens
copy {
   with copySpec {
      from "input/xxx"
   into "$buildDir/foo"
      filter(x.y.MyCustomFilter)
   }
   with copySpec {
      from "input/yyy"
   into "$buildDir/bar"
      filter(ReplaceTokens, tokens: [aaa: '111', bbb: '222'])
   }
}

Note that as of 2.14-rc1 you can specify the filtering charset (CopySpec.filteringCharset) thanks to this improvement.

Hi Lance,

Thanks for the reply. Using filters does seem to solve the requirement to add a header to the file and substitute in a replacement for a token.

What I can’t seem to figure out now is how to get the base64 encoding to work, and, problematically, this obviously needs to happen first.

Currently the files are all run through an ant script which executes /usr/bin/base64 on the file content, then substitutes the file content into an existing template which has both a @CONTENT@ and a @PATH@ token. The problem is that while I could just execute the base64 bin on the file and do essentially the same thing, we are moving to a more restrictive build environment where, by default, we don’t have any permissions on files not deemed “secure and critical”, so we are actually losing the ability to execute the base64 bin.

To work around that we can obviously just write a Java class which base64 encodes the content if we just read it in as a string, but from what I’ve seen that approach requires reading a file and base64 encoding it out to a new file. Then we could read in the new file and use filters to get the desired output. Which, while a pretty minor complaint, does still leave us with duplicate copies.
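As an aside, the encoding itself needn’t touch the filesystem at all: java.util.Base64 works entirely in memory, and its MIME variant can even reproduce the 76-column line wrapping that GNU /usr/bin/base64 emits by default (modulo the trailing newline). A small sketch, with an illustrative helper name:

```java
import java.util.Base64;

public class WrapDemo {
    // Encodes bytes to base64 with 76-character lines separated by '\n',
    // which matches the default line wrapping of GNU coreutils' base64(1)
    // (coreutils also prints a trailing newline, which this does not).
    static String base64LikeCoreutils(byte[] data) {
        return Base64.getMimeEncoder(76, new byte[] {'\n'}).encodeToString(data);
    }

    public static void main(String[] args) {
        // 100 input bytes encode to 136 base64 chars, so the output wraps once.
        System.out.println(base64LikeCoreutils(new byte[100]));
    }
}
```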

I’m not sure if I’m just misunderstanding filters, but it doesn’t seem like they can help me get the original raw file encoded before appending the rest of the text.

This reminded me of a problem I had to solve some time ago: https://discuss.gradle.org/t/generating-multiple-source-files-from-a-template/14800.

In your case it might be better to perform the base64 encoding first as a separate task (or a copy step). You cannot use a line filter, so you might want to try:

eachFile { fcd ->
  // yourBase64Method is a placeholder for your own encoding logic
  String content = yourBase64Method(fcd.file.text)
  fcd.file.text = content
}

Obviously that is a bit naive - if you have large files you might run out of memory.
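On the large-file point, java.util.Base64 also offers Encoder.wrap(OutputStream), which lets you stream the encoding through a fixed-size buffer instead of holding the whole file in memory. A sketch (the class and method names are illustrative):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class StreamingEncode {
    // Copies 'source' into 'target' as base64 without ever loading
    // the whole file into memory.
    static void encodeFile(Path source, Path target) throws Exception {
        try (InputStream in = Files.newInputStream(source);
             OutputStream out = Base64.getEncoder().wrap(Files.newOutputStream(target))) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        } // closing the wrapping stream flushes the final padding bytes
    }
}
```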

Then you just need a second copy which combines an Ant ConcatFilter and a ReplaceTokens filter:

eachFile { fcd ->
  // I've not tested this line - you'll need to do some reading on
  // FileCopyDetails & RelativePath classes from the Gradle API.
  String relativePathName = fcd.relativeSourcePath.pathString

  // Add your header (ConcatFilter's 'prepend' expects a File)
  fcd.filter(ConcatFilter, prepend: file('/path/to/your/header/template'))

  // Now have the tokens in the header replaced.
  fcd.filter(ReplaceTokens, tokens: [PATH: relativePathName])
}