Andrea Giammarchi (2015-02-16T18:18:45.000Z)
> In fact, you can specify a generator function with an infinite loop (like
the infamous while (true) { .. }) that essentially never finishes. While
that's usually madness or a mistake in a normal JS program, with generator
functions it's perfectly sane and sometimes exactly what you want to do!

Just probably the most-read piece about JS generators ...
http://davidwalsh.name/es6-generators
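
For context, a minimal sketch of such a never-finishing generator (the `naturals` name is mine, not from the article):

```js
// an endless sequence: the generator itself never completes,
// the consumer decides when to stop calling .next()
function *naturals() {
  var n = 0;
  while (true) {
    yield n++;
  }
}

var it = naturals();
it.next(); // { value: 0, done: false } -- each call allocates a fresh result object
it.next(); // { value: 1, done: false }
```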

Hence the potential problem caused by `while (!yielded.next().done)`:
thousands of objects created and quickly trashed ... I agree that's not the
best way to go with generators, but I've seen many code bases built on
similar automagic resolution.
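
To make the pattern concrete, here is a rough sketch of that kind of loop (`someGenerator` is just a placeholder for any generator):

```js
var yielded = someGenerator(); // placeholder: any generator or iterator

// each .next() call returns a brand new { value, done } object that is
// read once and immediately discarded, so a long loop like this produces
// thousands of short-lived objects for the GC to collect
while (!yielded.next().done) {
  // work happens as a side effect inside the generator
}
```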

Another example, about going async, is the `runGenerator` function:
http://davidwalsh.name/async-generators

```js
// run (async) a generator to completion
// Note: simplified approach: no error handling here
function runGenerator(g) {
    var it = g(), ret;

    // asynchronously iterate over generator
    (function iterate(val){
        ret = it.next( val );

        if (!ret.done) {
            // poor man's "is it a promise?" test
            if ("then" in ret.value) {
                // wait on the promise
                ret.value.then( iterate );
            }
            // immediate value: just send right back in
            else {
                // avoid synchronous recursion
                setTimeout( function(){
                    iterate( ret.value );
                }, 0 );
            }
        }
    })();
}
```
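
For completeness, this is roughly how such a runner is used (a sketch; `fetchJSON` is a placeholder for any promise-returning function, not an API from the article):

```js
// the generator body reads top-to-bottom as if it were synchronous,
// while runGenerator pauses it at each yield until the promise settles
runGenerator(function *main(){
  var user  = yield fetchJSON("/user.json");  // placeholder URL
  var posts = yield fetchJSON("/posts.json"); // placeholder URL
  console.log(user, posts);
});
```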

And since one of the purposes/side effects of generators is to make code look
synchronous, I don't think Kyle went too far or anything; he simply showed a
pattern that I've already seen applied here and there.

So yes, behind libraries or in users' hands, the amount of garbage can be
"too damn high".

Best Regards




On Mon, Feb 16, 2015 at 3:45 PM, Andreas Rossberg <rossberg at google.com>
wrote:

> On 16 February 2015 at 15:41, Andrea Giammarchi <
> andrea.giammarchi at gmail.com> wrote:
>
>> Common pattern is to poll `.next()` on a yield until its `done` property is
>> `true` so that a value can be used.
>>
>> This is I believe the common case that will create thousands of objects
>> to be quickly trashed as garbage ... so I was wondering if those are all
>> needed
>>
>
> Er, I don't think this is the common use case at all. You iterate over
> something to process values, otherwise there isn't much point in using
> iterators in the first place.
>
> /Andreas
>
>
>
>> On Mon, Feb 16, 2015 at 2:38 PM, Andrea Giammarchi <
>> andrea.giammarchi at gmail.com> wrote:
>>
>>> then frozen `{done: false}` without even the `value` property ... Would
>>> this work or speed-up anything at all?
>>>
>>> On Mon, Feb 16, 2015 at 2:32 PM, Andreas Rossberg <rossberg at google.com>
>>> wrote:
>>>
>>>> On 16 February 2015 at 15:21, Andrea Giammarchi <
>>>> andrea.giammarchi at gmail.com> wrote:
>>>>
>>>>> > Shared mutable result objects
>>>>>
>>>>> FWIW to me that could even be a singleton frozen `{done: false,
>>>>> value: null}` constant during iteration and a new object only once done.
>>>>>
>>>>> Would this work or speed-up anything at all?
>>>>>
>>>>
>>>> Frozen value null? I don't understand. You'd get the actual values
>>>> from... where?
>>>>
>>>> /Andreas
>>>>
>>>>
>>>> On Mon, Feb 16, 2015 at 2:04 PM, Andreas Rossberg <rossberg at google.com>
>>>>> wrote:
>>>>>
>>>>>> On 15 February 2015 at 12:07, Katelyn Gadd <kg at luminance.org> wrote:
>>>>>>
>>>>>>> I'm certainly in favor of VMs improving to handle that, and adding
>>>>>>> pressure for it is good. However, optimizing a TypedArray temporary
>>>>>>> arg to .set() is a much simpler problem than doing the escape
>>>>>>> analysis
>>>>>>> necessary to be certain a .next() result doesn't escape from a
>>>>>>> calling
>>>>>>> scope and isn't used after a later next() call. Applying pressure
>>>>>>> will
>>>>>>> be a good way to make sure VM authors do the work necessary for this
>>>>>>> to happen, but if iterators are unacceptably slow in shipping
>>>>>>> implementations for a year+ I think the odds are good that most
>>>>>>> shipping software will avoid using them, at which point VM authors
>>>>>>> will have no reason to optimize for primitives nobody uses. =[
>>>>>>>
>>>>>>
>>>>>> Engines are still ramping up on ES6 features, and most probably
>>>>>> haven't been able to put much resources into optimisations yet (ES6 is
>>>>>> large!). You can't compare it to old features that have been tuned for a
>>>>>> decade and expect equal performance. This situation is unfortunate but
>>>>>> unavoidable.
>>>>>>
>>>>>> Either way, it doesn't justify cutting corners in the semantics in a
>>>>>> naive and premature attempt to optimise (which might even harm more
>>>>>> high-level optimisations in the long run). Shared mutable result objects
>>>>>> would be a horrible API with footgun potential, especially when you start
>>>>>> to build stream abstractions on top. FWIW, this has been discussed at
>>>>>> length at some of the meetings.
>>>>>>
>>>>>> The fixed layout of the iterator object would allow the GC to allocate
>>>>>>> it cheaply and in the case of values (like ints) it wouldn't need to
>>>>>>> trace it either - so that helps a lot. But I don't know how realistic
>>>>>>> those optimizations are in practice.
>>>>>>>
>>>>>>
>>>>>> Not sure what you mean here. Unfortunately, result objects are still
>>>>>> mutable and extensible, so anything can happen.
>>>>>>
>>>>>> /Andreas
>>>>>>
>>>>>>
>>>>>> On 15 February 2015 at 02:36, Andrea Giammarchi
>>>>>>> <andrea.giammarchi at gmail.com> wrote:
>>>>>>> > +1 and I've raised same concerns 2 years ago [1]
>>>>>>> >
>>>>>>> > IIRC the outcome was that VM should be good enough to handle
>>>>>>> objects with
>>>>>>> > very short lifecycle, I'm still convinced (behind tests) that
>>>>>>> generators are
>>>>>>> > overkill for IoT devices (low clock and way lower RAM).
>>>>>>> >
>>>>>>> > Having always same object per iteration makes sense to me at least
>>>>>>> until
>>>>>>> > it's done so that could be just a struct-like `{done: false,
>>>>>>> value: null}`
>>>>>>> > object and GC will be happier than ever.
>>>>>>> >
>>>>>>> > Regards
>>>>>>> >
>>>>>>> >
>>>>>>> > [1]
>>>>>>> >
>>>>>>> http://webreflection.blogspot.co.uk/2013/06/on-harmony-javascript-generators.html
>>>>>>> >
>>>>>>> > On Sun, Feb 15, 2015 at 10:06 AM, Katelyn Gadd <kg at luminance.org>
>>>>>>> wrote:
>>>>>>> >>
>>>>>>> >> As specified, iterator .next() seems to be required to return a
>>>>>>> new
>>>>>>> >> object instance for each iteration.
>>>>>>> >>
>>>>>>> >> In my testing (and in my theory, as an absolute) this is a real
>>>>>>> >> performance defect in the spec and it will make iterators
>>>>>>> inferior to
>>>>>>> >> all other forms of sequence iteration, to the extent that they
>>>>>>> may end
>>>>>>> >> up being used very rarely, and developers will be biased away
>>>>>>> from Map
>>>>>>> >> and Set as a result.
>>>>>>> >>
>>>>>>> >> The issue here is that the new object requirement means that every
>>>>>>> >> iteration produces GC pressure. I think that past APIs with this
>>>>>>> >> problem (for example TypedArray.set) have proven that 'a
>>>>>>> sufficiently
>>>>>>> >> smart VM can optimize this' is not representative of real VMs or
>>>>>>> real
>>>>>>> >> use cases.
>>>>>>> >>
>>>>>>> >> In the specific case of .next(), the method returning a new
>>>>>>> object on
>>>>>>> >> every iteration does not produce any actual improvement to
>>>>>>> usability:
>>>>>>> >> There is no realistic use case that requires saving multiple
>>>>>>> next()
>>>>>>> >> results from the same sequence, as the sequence itself represents
>>>>>>> (at
>>>>>>> >> least in most cases) a container or generated value sequence that
>>>>>>> is
>>>>>>> >> fully reproducible on demand.
>>>>>>> >>
>>>>>>> >> I think allowing (or requiring) implementations to return the same
>>>>>>> >> object instance from every .next() call, or perhaps as a usability
>>>>>>> >> compromise, reusing a pair of objects on a round-robin basis (so
>>>>>>> that
>>>>>>> >> you can keep around the current and prior result) would be a very
>>>>>>> good
>>>>>>> >> decision here.
>>>>>>> >>
>>>>>>> >> In my testing Map and Set are outperformed by a trivial Object or
>>>>>>> >> Array based data structure in every case, *despite the fact* that
>>>>>>> >> using an Object as a Map requires the use of Object.keys() to be
>>>>>>> able
>>>>>>> >> to sequentially iterate elements. The cost of iterator.next() in
>>>>>>> v8
>>>>>>> >> and spidermonkey is currently extremely profound and profiling
>>>>>>> shows
>>>>>>> >> all the time is being spent in object creation and GC. (To be
>>>>>>> fair,
>>>>>>> >> self-hosting of iterations might improve on this some.)
>>>>>>> >>
>>>>>>> >> Oddly enough, I consider the ES iterator spec to be a big
>>>>>>> improvement
>>>>>>> >> over C#'s IEnumerable, in terms of usability/API. But this is an
>>>>>>> area
>>>>>>> >> where it is intrinsically worse performance-wise than IEnumerable
>>>>>>> and
>>>>>>> >> that's unfortunate.
>>>>>>> >>
>>>>>>> >> -kg
>>>>>>> >> _______________________________________________
>>>>>>> >> es-discuss mailing list
>>>>>>> >> es-discuss at mozilla.org
>>>>>>> >> https://mail.mozilla.org/listinfo/es-discuss
>>>>>>> >
>>>>>>> >
>>>>>>> _______________________________________________
>>>>>>> es-discuss mailing list
>>>>>>> es-discuss at mozilla.org
>>>>>>> https://mail.mozilla.org/listinfo/es-discuss
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>