
[feature request] Streaming wrapper #59

Open
@shaicoleman

Description

I'm trying to nest streams so that I can do streaming compressed uploads and downloads of large files to/from S3.

With zlib I can use Zlib::GzipWriter.wrap / Zlib::GzipReader.wrap, but there doesn't seem to be an equivalent in zstd-ruby.

Example zlib code, which I would like to migrate to zstd-ruby:

require 'aws-sdk-s3'
require 'zlib'

s3_bucket = 'my_s3_bucket'
s3_key = 'my_s3_key'
filename = '/my/path'

s3_object = Aws::S3::Object.new(s3_bucket, s3_key)
s3_object.upload_stream do |s3_stream|
  # Wrap the S3 upload stream in a GzipWriter: anything written to gz is
  # compressed and forwarded to the multipart upload.
  Zlib::GzipWriter.wrap(s3_stream, ::Zlib::BEST_COMPRESSION) do |gz|
    File.open(filename) do |f|
      IO.copy_stream(f, gz)
    end
  end
end

Zlib::GzipWriter.wrap / Zlib::GzipReader.wrap documentation:
https://ruby-doc.org/3.2.2/exts/zlib/Zlib/GzipFile.html#method-c-wrap
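
For zstd-ruby, something with the same shape could presumably be layered on top of the existing Zstd::StreamingCompress. A rough sketch of the kind of API this issue is asking for (ZstdWriter and its methods are hypothetical, not part of the gem):

require 'zstd-ruby'

# Hypothetical sketch only: ZstdWriter is not part of zstd-ruby, it just
# illustrates a wrap-style API built on Zstd::StreamingCompress.
class ZstdWriter
  def self.wrap(io)
    writer = new(io)
    return writer unless block_given?
    begin
      yield writer
    ensure
      writer.close
    end
  end

  def initialize(io)
    @io = io
    @stream = Zstd::StreamingCompress.new
  end

  # Compress a chunk and forward the compressed bytes to the wrapped IO.
  # Returning the byte count keeps it usable as an IO.copy_stream target.
  def write(data)
    @io << @stream.compress(data)
    data.bytesize
  end
  alias_method :<<, :write

  # Flush any buffered data and close the zstd frame.
  def close
    @io << @stream.finish
    nil
  end
end

Usage would then mirror the gzip example above:

s3_object.upload_stream do |s3_stream|
  ZstdWriter.wrap(s3_stream) do |zw|
    File.open(filename) { |f| IO.copy_stream(f, zw) }
  end
end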

For similar functionality, I currently have to chunk the file manually, e.g.:

require 'aws-sdk-s3'
require 'zstd-ruby'

CHUNK_SIZE = 1024 * 1024

zstd_stream = Zstd::StreamingCompress.new
s3_object = Aws::S3::Object.new(@bucket, key)
s3_object.upload_stream do |s3_stream|
  s3_stream << zstd_stream.compress('')
  File.open(filename) do |file|
    while (chunk = file.read(CHUNK_SIZE))
      # Compress each chunk and forward the output to the multipart upload
      s3_stream << zstd_stream.compress(chunk)
    end
  end
  # Flush the remaining buffered data and close the zstd frame
  s3_stream << zstd_stream.finish
end
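
For completeness, the download direction looks similar with Zstd::StreamingDecompress. A rough sketch, assuming Aws::S3::Object#get streams the response body to a block in chunks, and reusing the variable names from the gzip example:

require 'aws-sdk-s3'
require 'zstd-ruby'

zstd_stream = Zstd::StreamingDecompress.new
s3_object = Aws::S3::Object.new(s3_bucket, s3_key)
File.open(filename, 'wb') do |file|
  # Assumes #get yields the object body to the block chunk by chunk
  s3_object.get do |chunk|
    file << zstd_stream.decompress(chunk)
  end
end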
