Feature Request: Producer Writable stream #78

@patrykwegrzyn

Description

Overview

At the moment we have a nice streaming experience when consuming messages, but on the producing side it’s all about producer.send().

This proposal introduces producer.asWritable() (or some better name for it).

Why?

  • It makes streaming to Kafka feel like streaming anywhere else in Node.
  • Native backpressure support
  • Optional batching (by size or time)
  • Built-in delivery reports, for when folks need manual commits
  • Stream composability, so you can plug it into pipeline() and drop in handy tools like Transform/syncthrough

What it could look like?

const sink = producer.asWritable({
  objectMode: true,
  highWaterMark: 1600,
  batchSize: 500,             // max messages per flush
  batchTime: 20,              // flush after this many ms (even if under batchSize)
  deliveryReport: 'per-message' // or 'none' | 'per-batch'
})

sink.on('delivery-report', report => {
  console.log('Acked:', report)
})

Pipeline example

const readable = new Consumer({ /* ... */ })
const producer = new Producer({ /* ... */ })

const sink = producer.asWritable({
  batchSize: 1000,
  batchTime: 50,
  deliveryReport: 'per-batch'
})

await pipeline(readable, transform, syncthrough, /* windowing, etc. */ sink)

This opens the door to writing small, composable stream transforms, ideal for windowing, aggregation, enrichment...
basically a light, flexible foundation for a poor man’s Kafka Streams.

Feedback welcome on naming, defaults, event shapes, or anything else that feels off.
