Publishing large files to a Maven repository on S3 fails when resetting the input stream

While publishing a large file (1.6 GiB) to an S3 bucket with the maven-publish plugin in Gradle 6.0, I keep getting the following exception:

Caused by: com.amazonaws.ResetException: The request to the service failed with a retryable reason, but resetting the request input stream has failed. See exception.getExtraInfo or debug-level logging for the original failure that caused this retry.; If the request involves an input stream, the maximum stream buffer size can be configured via request.getRequestClientOptions().setReadLimit(int)
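
For reference, the repository is declared in the usual way (Kotlin DSL shown; the bucket name and credential lookup are placeholders for my real setup):

    publishing {
        repositories {
            maven {
                // Placeholder bucket/path; the real one points at our release bucket.
                url = uri("s3://my-bucket/releases")
                credentials(AwsCredentials::class) {
                    accessKey = System.getenv("AWS_ACCESS_KEY_ID")
                    secretKey = System.getenv("AWS_SECRET_ACCESS_KEY")
                }
            }
        }
    }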

According to various resources, this seems to be caused by the reset occurring further back than the read limit on the stream. The read limit can be changed either by calling the method mentioned in the error message, or by passing the following system property to the AWS SDK:

-Dcom.amazonaws.sdk.s3.defaultStreamBufferSize=YOUR_MAX_PUT_SIZE
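
For what it's worth, the call the error message refers to isn't something I can reach from the maven-publish plugin, since Gradle drives the AWS SDK internally, which is why the system property is the only lever I have. In raw SDK (v1) terms the suggestion would look roughly like this (bucket, key and file path are placeholders):

    import com.amazonaws.services.s3.AmazonS3ClientBuilder
    import com.amazonaws.services.s3.model.ObjectMetadata
    import com.amazonaws.services.s3.model.PutObjectRequest
    import java.io.File
    import java.io.FileInputStream

    fun main() {
        val file = File("build/libs/big-artifact.jar")  // placeholder path
        val metadata = ObjectMetadata().apply { contentLength = file.length() }
        val request = PutObjectRequest("my-bucket", "releases/big-artifact.jar", FileInputStream(file), metadata)
        // The read limit bounds how far back the SDK can reset the stream on a
        // retry, so for a single PUT it has to cover the full object size.
        request.requestClientOptions.readLimit = (file.length() + 1).toInt()
        AmazonS3ClientBuilder.defaultClient().putObject(request)
    }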

I’ve verified that the option above raises the read limit accordingly, but now I’m running out of Java heap space instead (presumably because the SDK has to buffer up to the read limit in memory so it can reset the stream on a retry), so the upload still fails.
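
Assuming the upload runs in the Gradle daemon, what I’m currently experimenting with in gradle.properties looks roughly like this (the heap size and buffer size are guesses sized to the 1.6 GiB artifact):

    # gradle.properties
    org.gradle.jvmargs=-Xmx4g -Dcom.amazonaws.sdk.s3.defaultStreamBufferSize=1800000000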

I’m not sure if the S3 upload API needs to be used in a different way to support retries for large files, or if running with a large Java heap is my only option, but any suggestions are welcome.
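
To illustrate what I mean by using the API differently: uploading the same file with the SDK's TransferManager outside of Gradle would be a multipart upload read from disk, where a failed part can be retried without rewinding one huge in-memory buffer. A minimal sketch (names are placeholders), though I don’t know whether Gradle’s S3 transport can be made to work this way:

    import com.amazonaws.services.s3.AmazonS3ClientBuilder
    import com.amazonaws.services.s3.transfer.TransferManagerBuilder
    import java.io.File

    fun main() {
        val transferManager = TransferManagerBuilder.standard()
            .withS3Client(AmazonS3ClientBuilder.defaultClient())
            .build()
        // Multipart upload driven from the file on disk; parts are retried
        // individually, so no full-object buffer is needed in memory.
        val upload = transferManager.upload("my-bucket", "releases/big-artifact.jar", File("build/libs/big-artifact.jar"))
        upload.waitForCompletion()
        transferManager.shutdownNow()
    }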