Naveen Chawla (2017-07-31T10:38:32.000Z)
Yes, you need to intervene and reject the latest promise upon timeout (by
having a reference to its "reject" callback).

This makes me wonder (and I'd like to be corrected if wrong) whether async
iterators are more of a hindrance than a help.

We can currently do a loop over an array of promises, without async
iterators:

```javascript
async function requestLoopAsync(requestItemPromises) {
    for (const requestItemPromise of requestItemPromises) {
        // We can intervene here BEFORE we await the promise, unlike
        // with async iterators, e.g.
        // requestItemPromise.myRejectCallbackReference()

        const response = await requestItemPromise; // etc.
    }
}
```

Am I right or wrong?

(For the timeout example, we could do `currentRequestItemPromise =
requestItemPromise`, then on timeout call
`currentRequestItemPromise.myRejectCallbackReference()`, where
`myRejectCallbackReference` was assigned when we created the promise, e.g.
`this.myRejectCallbackReference = reject` from the `reject` parameter in
`(resolve, reject) =>`.)
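
Roughly, creating such a promise could look like the sketch below. This is
only an illustration of the idea, not an existing API:
`createRequestItemPromise` and `fetchItem` are made-up names, and here the
`reject` reference is attached to the promise object itself so it can be
called as `requestItemPromise.myRejectCallbackReference()`:

```javascript
// Sketch only: save the executor's "reject" callback on the promise object
// so code outside the executor can reject it later (e.g. from a timeout).
// createRequestItemPromise and fetchItem are illustrative names.
function createRequestItemPromise(fetchItem) {
    let rejectReference;
    const promise = new Promise((resolve, reject) => {
        rejectReference = reject;          // keep the reject callback
        fetchItem().then(resolve, reject); // run the real request
    });
    promise.myRejectCallbackReference = rejectReference;
    return promise;
}
```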

On Mon, 31 Jul 2017 at 10:40 kai zhu <kaizhu256 at gmail.com> wrote:

> the timeout handler will not work as advertised, e.g. what if io / db
> issues cause a network stream to intermittently respond in intervals far
> greater than 30000ms or not at all?
>
> > On Jul 31, 2017, at 7:26 AM, James Browning <thejamesernator at gmail.com>
> wrote:
> >
> > It'll look something like this:
> >
> > ```javascript
> > async function consumeReadableStream(stream) {
> >     const start = Date.now()
> >     for await (const chunk of stream) {
> >         /* Do whatever you want with the chunk here, e.g. await other
> >            async tasks with chunks, send them off to wherever, etc.
> >         */
> >         if (Date.now() - start > 30000) {
> >             throw new Error('30000 ms timeout')
> >         }
> >     }
> >     /* Instead of callbackOnce the returned promise from this function
> >        itself can be used */
> > }
> > ```
naveen.chwl at gmail.com (2017-07-31T10:47:04.731Z)
Yes, you need to intervene and reject the latest promise upon timeout (by
having a reference to its "reject" callback).

This makes me wonder (and I'd like to be corrected if wrong) whether async
iterators are more of a hindrance than a help.

We can currently do a loop over an array of promises, without async
iterators:

```javascript
let currentRequestItemPromise;

async function requestLoopAsync(requestItemPromises) {
    for (const requestItemPromise of requestItemPromises) {
        // We can assign this here BEFORE we await the promise, unlike
        // with async iterators:
        currentRequestItemPromise = requestItemPromise;

        const response = await requestItemPromise; // etc.
    }
}
```

Am I right or wrong?

(For the timeout example, on timeout we could call
`currentRequestItemPromise.myRejectCallbackReference()`, where
`myRejectCallbackReference` was assigned when we created the promise, e.g.
`this.myRejectCallbackReference = reject` from the `reject` parameter in
`(resolve, reject) =>`.)
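
Putting the loop and the timeout together might look roughly like the sketch
below. It is only an illustration, assuming each item in
`requestItemPromises` is a promise created as described above, i.e. with
`myRejectCallbackReference` attached to it; the 30000 ms figure just mirrors
the earlier example:

```javascript
// Sketch only: reject whichever request promise the loop is currently
// awaiting if the overall timeout fires. Assumes each promise carries a
// myRejectCallbackReference property as described above.
let currentRequestItemPromise;

async function requestLoopAsync(requestItemPromises) {
    const timer = setTimeout(() => {
        if (currentRequestItemPromise) {
            currentRequestItemPromise.myRejectCallbackReference(
                new Error('30000 ms timeout')
            );
        }
    }, 30000);

    try {
        for (const requestItemPromise of requestItemPromises) {
            currentRequestItemPromise = requestItemPromise;
            const response = await requestItemPromise; // etc.
        }
    } finally {
        clearTimeout(timer); // stop the timer whether the loop finishes or throws
    }
}
```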