Async iterator destructuring?
# Bruno Macabeus (6 years ago)
But isn't it enough to do this?
```js
const [item] = await db.scan({
  ...
})
```
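That works when `db.scan(...)` resolves to an array. The case the proposal targets is a `scan` that returns an async iterator, where plain destructuring fails. A minimal sketch with a hypothetical `scan`:

```js
// Hypothetical scan() that yields rows one at a time, standing in for a
// database driver whose scan returns an async iterator, not a Promise<Array>.
async function* scan(options) {
  yield {key: 'a'}
  yield {key: 'b'}
}

// Awaiting the call just gives back the async iterator object; array
// destructuring then throws, because async iterators have no Symbol.iterator.
const [item] = await scan({limit: 1}) // TypeError: object is not iterable
```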
# Bergi (6 years ago)
> It might be useful on some occasions to [collect] async iterators [into an array].
No need for destructuring or spreading here. The iterator helpers proposal (tc39/proposal-iterator-helpers) already covers these cases:
```js
return Buffer.concat(await someStream.setEncoding('buffer').toArray())
```

```js
// It's different for each database
const [item] = await db.scan({
  filter: {key: value},
  limit: 1,
}).toArray()
```
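Until those helpers ship everywhere, the same shape is a few lines of userland code (a sketch only; this `toArray` is a stand-in, not the proposal's actual method):

```js
// Userland stand-in for a toArray() helper: collects an async iterable
// into an array, optionally stopping after `limit` items.
async function toArray(asyncIterable, limit = Infinity) {
  const items = []
  for await (const item of asyncIterable) {
    items.push(item)
    if (items.length >= limit) break
  }
  return items
}

// Usage mirroring the examples above:
// const buffers = await toArray(someStream.setEncoding('buffer'))
// const [item] = await toArray(db.scan({filter: {key: value}, limit: 1}))
```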
kind regards, Bergi
# Isiah Meadows (6 years ago)
It might be useful on some occasions to destructure async iterators. For the sake of example, I'm using `await [...]` as the syntax, but I'm by no means married to it.

1. Collecting a Node stream into a buffer:

```js
const await [...buffers] = someStream.setEncoding('buffer')
return Buffer.concat(buffers)
```

2. Collecting the first matching entry in a database scan:

```js
// It's different for each database
const await [item] = db.scan({
  filter: {key: value},
  limit: 1,
})
```

It's not a common need, but it's useful either way, and it brings async iterators and sync iterators closer to feature parity.
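For what it's worth, the single-element form could be read as roughly the following desugaring (a sketch only; the post above doesn't pin down evaluation order or iterator closing):

```js
// Rough sketch of what `const await [item] = source` might mean:
// pull only as many values as the pattern needs, then close the iterator.
// (Illustrative only; not part of the proposal text.)
async function destructureOne(source) {
  const iter = source[Symbol.asyncIterator]()
  try {
    const {value, done} = await iter.next()
    return done ? undefined : value
  } finally {
    // Close the iterator early, mirroring what sync destructuring does
    // when it stops before exhaustion.
    if (typeof iter.return === 'function') await iter.return()
  }
}

// `const await [item] = db.scan({...})` would then behave roughly like:
// const item = await destructureOne(db.scan({...}))
```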