Overview
At the moment we have a nice streaming experience when consuming messages, but on the producing side it's all about producer.send().
This proposal introduces producer.asWritable() (or some better name for it).
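For reference, this is roughly what you end up hand-rolling today (a minimal sketch, assuming producer.send() accepts { messages } and returns a promise):

```js
import { Writable } from 'node:stream'

// Sketch of the status quo: one send per write, no batching,
// no delivery-report events.
const sink = new Writable({
  objectMode: true,
  write (message, _enc, callback) {
    // Holding the callback until the send settles is what gives backpressure
    producer.send({ messages: [message] }).then(() => callback(), callback)
  }
})
```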
Why?
- Because it makes streaming to Kafka feel like streaming anywhere else in Node.
- Native backpressure support
- Optional batching (by size or time)
- Built-in delivery reports, for when folks need manual commits
- Stream composability, so you can plug it into pipeline() and drop in handy tools like Transform/Syncthrough
What it could look like
```js
const sink = producer.asWritable({
  objectMode: true,
  highWaterMark: 1600,
  batchSize: 500,               // max messages per flush
  batchTime: 20,                // flush after this many ms (even if under batchSize)
  deliveryReport: 'per-message' // or 'none' | 'per-batch'
})

sink.on('delivery-report', report => {
  console.log('Acked:', report)
})
```
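Implementation-wise this could stay fairly small. A rough sketch, not a real design: it assumes producer.send() takes { messages } and returns a promise, and glosses over per-message reports and overlapping flushes:

```js
import { Writable } from 'node:stream'

// Sketch only: batches writes by size or time, emits 'delivery-report',
// and gets backpressure for free from highWaterMark.
function asWritable (producer, opts = {}) {
  const {
    highWaterMark = 1600,
    batchSize = 500,
    batchTime = 20,
    deliveryReport = 'none'
  } = opts

  let buffer = []
  let timer = null

  function flush (callback) {
    clearTimeout(timer)
    timer = null
    if (buffer.length === 0) return callback()
    const batch = buffer
    buffer = []
    producer.send({ messages: batch }).then(report => {
      // Per-batch granularity only; splitting into per-message reports is omitted
      if (deliveryReport !== 'none') sink.emit('delivery-report', report)
      callback()
    }, callback)
  }

  const sink = new Writable({
    objectMode: true,
    highWaterMark,
    write (message, _enc, callback) {
      buffer.push(message)
      if (buffer.length >= batchSize) {
        // Hold the callback until the batch is sent: that's the backpressure
        flush(callback)
      } else {
        if (!timer) {
          timer = setTimeout(() => flush(err => { if (err) sink.destroy(err) }), batchTime)
        }
        callback()
      }
    },
    final (callback) {
      flush(callback) // flush whatever is left on end
    }
  })

  return sink
}
```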
Pipeline example
```js
import { pipeline } from 'node:stream/promises'

const readable = new Consumer({ /* ... */ })
const producer = new Producer({ /* ... */ })

const sink = producer.asWritable({
  batchSize: 1000,
  batchTime: 50,
  deliveryReport: 'per-batch'
})

await pipeline(readable, /* transform, syncthrough, window, etc... */ sink)
```
This opens the door to writing small, composable stream transforms, ideal for windowing, aggregation, enrichment...
basically a light, flexible foundation for a poor man's Kafka Streams (toy example below).
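To make that concrete, here's a hypothetical countWindow() transform (the name and the { key, value } message shape are made up for the example) that tumbles counts per key and feeds the summaries to the sink:

```js
import { Transform } from 'node:stream'

// Hypothetical tumbling window: counts messages per key and pushes
// one summary message per key every windowMs.
function countWindow (windowMs = 1000) {
  let counts = new Map()

  function emitWindow () {
    for (const [key, count] of counts) {
      tx.push({ key, value: JSON.stringify({ count }) })
    }
    counts = new Map()
  }

  const tx = new Transform({
    objectMode: true,
    transform (message, _enc, callback) {
      const key = String(message.key ?? '')
      counts.set(key, (counts.get(key) ?? 0) + 1)
      callback()
    },
    flush (callback) {
      emitWindow() // emit whatever is left when the pipeline ends
      callback()
    }
  })

  const interval = setInterval(emitWindow, windowMs)
  tx.once('close', () => clearInterval(interval))

  return tx
}

// await pipeline(readable, countWindow(1000), sink)
```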
Feedback welcome: naming, defaults, event shapes, or anything else that feels off.