Package-level declarations
Functions
fun S3AsyncClient.mergeContents(bucket: String, key: String, concurrency: Int = 1, files: List<Pair<String, String>>): Flow<S3Response>
Merges multiple source files into a single destination object in an S3 bucket by performing multipart upload copy operations with the S3AsyncClient.
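A minimal usage sketch, assuming each Pair in files is (source bucket, source key); all bucket and key names are hypothetical, and the import for the extension itself is omitted because its package is not shown on this page.

import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    // Copy two part files into one destination object (names are made up).
    s3.mergeContents(
        bucket = "destination-bucket",
        key = "merged/output.csv",
        concurrency = 4,
        files = listOf(
            "source-bucket" to "parts/part-0.csv",   // assumed to be (bucket, key)
            "source-bucket" to "parts/part-1.csv",
        ),
    ).collect { response -> println(response) }
}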
fun S3AsyncClient.selectObjectContent(request: SelectObjectContentRequest.Builder.() -> Unit): Flow<SelectObjectContentEventStream>
Performs a Select Object Content operation on an object in an Amazon S3 bucket, retrieving a subset of the object's data with a simple SQL expression.
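A sketch of an S3 Select query through this extension; the bucket, key, and column names are hypothetical, and the nested serialization builders come from the standard AWS SDK v2 model types.

import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient
import software.amazon.awssdk.services.s3.model.*

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    s3.selectObjectContent {
        bucket("my-bucket")                             // hypothetical bucket
        key("data/report.csv")                          // hypothetical object key
        expression("SELECT s.name FROM S3Object s")     // simple SQL expression
        expressionType(ExpressionType.SQL)
        inputSerialization(
            InputSerialization.builder()
                .csv(CSVInput.builder().fileHeaderInfo(FileHeaderInfo.USE).build())
                .build()
        )
        outputSerialization(
            OutputSerialization.builder().csv(CSVOutput.builder().build()).build()
        )
    }.collect { event -> println(event) }               // stream of SelectObjectContentEventStream events
}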
fun S3AsyncClient.upload(upstream: Flow<ByteArray>, concurrency: Int = 1, initialRequest: CreateMultipartUploadRequest.Builder.() -> Unit): Flow<S3Response>
fun S3AsyncClient.upload(bucket: String, key: String, upstream: Flow<ByteArray>, concurrency: Int = 1): Flow<S3Response>
Creates a flow that uploads byte arrays to an Amazon S3 bucket.
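A minimal sketch of the bucket/key overload; the chunk contents and object names are made up, and the extension's own import is omitted because its package is not shown here.

import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    // Each emitted ByteArray contributes to the uploaded object.
    val chunks = flowOf("hello, ".toByteArray(), "world".toByteArray())
    s3.upload(bucket = "my-bucket", key = "greetings.txt", upstream = chunks, concurrency = 2)
        .collect { response -> println(response) }
}

The other overload configures the upload through a CreateMultipartUploadRequest.Builder receiver instead of explicit bucket/key arguments, e.g. s3.upload(chunks) { bucket("my-bucket"); key("greetings.txt") }.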
fun S3AsyncClient.uploadBytes(upstream: Flow<Byte>, concurrency: Int = 1, initialRequest: CreateMultipartUploadRequest.Builder.() -> Unit): Flow<S3Response>
fun S3AsyncClient.uploadBytes(bucket: String, key: String, upstream: Flow<Byte>, concurrency: Int = 1): Flow<S3Response>
Creates a flow that uploads bytes to an Amazon S3 bucket.
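The same pattern as upload, but per byte rather than per byte array; a short sketch with a hypothetical object name.

import kotlinx.coroutines.flow.asFlow
import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    // Emit the payload one Byte at a time (Flow<Byte> rather than Flow<ByteArray>).
    val bytes = "payload".toByteArray().toList().asFlow()
    s3.uploadBytes(bucket = "my-bucket", key = "payload.bin", upstream = bytes)
        .collect { response -> println(response) }
}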
fun S3AsyncClient.uploadSplit(bucket: String, upstream: Flow<Byte>, splitStrategy: GroupStrategy = GroupStrategy.Count(ONE_MB), concurrency: Int = 1, key: (Int) -> String): Flow<S3Response>
Splits the upstream byte flow into groups according to splitStrategy and uploads each group as a separate object to an Amazon S3 bucket using the S3AsyncClient, with key deriving each object's key from the group index.
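A sketch that streams a larger byte flow into multiple objects, one per group produced by splitStrategy; the key lambda receives the group index, and all names are hypothetical.

import kotlinx.coroutines.flow.asFlow
import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    val byteStream = ByteArray(5_000_000).toList().asFlow()   // ~5 MB of zero bytes
    // With the default GroupStrategy.Count(ONE_MB), this should produce several objects:
    // parts/chunk-0.bin, parts/chunk-1.bin, ...
    s3.uploadSplit(bucket = "my-bucket", upstream = byteStream, concurrency = 2) { index ->
        "parts/chunk-$index.bin"
    }.collect { response -> println(response) }
}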
fun <T> S3AsyncClient.uploadSplitItems(bucket: String, upstream: Flow<T>, splitStrategy: GroupStrategy = GroupStrategy.Count(1000), concurrency: Int = 1, key: (Int) -> String, f: suspend (T) -> ByteArray): Flow<S3Response>
Splits the upstream items into groups according to splitStrategy, serializes each item to bytes with f, and uploads each group as a separate object to an Amazon S3 bucket using the S3AsyncClient.
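A sketch of the item-based variant, batching a flow of strings with the default GroupStrategy.Count(1000) and serializing each item with f; the record shape, key names, and newline framing are hypothetical.

import kotlinx.coroutines.flow.asFlow
import kotlinx.coroutines.runBlocking
import software.amazon.awssdk.services.s3.S3AsyncClient

fun main() = runBlocking {
    val s3 = S3AsyncClient.create()
    val records = (1..2_500).map { """{"id":$it}""" }.asFlow()
    // With groups of 1000 items, 2500 records should yield three objects:
    // records/batch-0.jsonl, records/batch-1.jsonl, records/batch-2.jsonl.
    s3.uploadSplitItems(
        bucket = "my-bucket",
        upstream = records,
        key = { index -> "records/batch-$index.jsonl" },
    ) { record -> (record + "\n").toByteArray() }   // f: serialize each item to bytes
        .collect { response -> println(response) }
}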