generators vs forEach

# Claus Reinke (11 years ago)

prompted by this nodejs list thread "Weird error with generators (using suspend or galaxy)"

  1. higher order functions are used to model control structures

  2. generators/yield are designed to allow for suspend/resume of control structure code

These two statements come into conflict if one considers the restriction that generators be based on flat continuations, which is sufficient to support built-in control structures like "for" but not predefined control structures like "forEach". Support for nested generators ("yield*") differs from normal function calls.

I have not seen this conflict discussed here, so I wanted to raise it in case it was an oversight and something can be done about it. As far as I can tell, there are two issues:

  • current predefined operations like "forEach", "map", "filter", .. are not fully integrated with generators, even though they model synchronous operations; expecting users to duplicate their functionality for use with generators seems wrong;

  • is it even possible to define higher-order operations that can be used both normally (without "yield" inside their callbacks, without "yield" wrapping their result) and with generators (with "yield" inside their callbacks, with "yield" wrapping their result)?

# Jeremy Martin (11 years ago)

Alternatively, could yield simply be lexically bound to the nearest GeneratorFunction scope, rather than the nearest Function?

E.g., instead of:

suspend(function* (resume) {
  yield setTimeout(resume, 1000);
  console.log('foo');
  yield setTimeout(resume, 1000);
  console.log('bar');
})();

... we could write:

suspend(function* (resume) {
  ['foo', 'bar'].forEach(function(word) {
    yield setTimeout(resume, 1000);
    console.log(word);
  });
})();

The current state of things here is pretty ugly, and I'd really like to avoid having to add something like suspend.forEach(Array, GeneratorFunction) with yield* in the body.

# Domenic Denicola (11 years ago)

I believe duplicating the Array.prototype built-ins as generator versions, in user-code, is the expected path forward. Perhaps an "itertools"-like module will be standardized and added in ES7, after that cowpath has been paved.

This pain is somewhat alleviated by generator expressions (which cover filter and map) and for-of (which covers the rest, more manually).
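
To make the suggested path concrete, here is a minimal sketch of user-code generator versions of map and filter built on for-of, roughly what an "itertools"-like module might contain. The names `mapGen` and `filterGen` are illustrative, not from any standardized module.

```javascript
// Hypothetical user-land generator analogues of Array.prototype.map/filter,
// written with for-of as suggested; names are illustrative.
function* mapGen(iterable, fn) {
  for (const x of iterable) yield fn(x);
}

function* filterGen(iterable, pred) {
  for (const x of iterable) {
    if (pred(x)) yield x;
  }
}

// Composes lazily: nothing runs until values are pulled out.
const evensSquared = mapGen(filterGen([1, 2, 3, 4], x => x % 2 === 0),
                            x => x * x);
console.log([...evensSquared]); // [4, 16]
```

Unlike the array methods, these work on any iterable (including other generators) and never build intermediate arrays.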

# Mark S. Miller (11 years ago)

This would make generators deep, violating the non-interleaving assumptions of intermediate callers on the call stack. This is why we accepted generators only on condition that they be shallow. We knew at the time that this privileges built-in control structures over user defined ones. The alternative would have been to omit generators completely. We agree that shallow generators were worth it, despite this non-uniformity.

Put another way, shallow generators are equivalent to a local cps transform of the generator function itself. Deep generators would require the equivalent of CPS transforming the world -- violating the stateful assumptions of existing code.
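
A small sketch of what "shallow" means in practice: yield can suspend only the generator's own frame, so it is legal directly in the function* body (or via yield* delegation), but not inside a nested callback, because that would require suspending an intermediate non-generator frame.

```javascript
// Shallow generators: only the generator's own stack frame is suspended.
function* shallow() {
  yield 1;        // fine: yield is lexically in the function* body
  // [2, 3].forEach(function (x) { yield x; }); // SyntaxError if uncommented:
  // the callback is an ordinary function, and yield cannot suspend the
  // forEach frame sitting between it and the generator.
  yield* [2, 3];  // fine: yield* delegates to another (shallow) iterator
}
console.log([...shallow()]); // [1, 2, 3]
```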

# Jeremy Martin (11 years ago)

Correct me if I'm wrong, but wouldn't the lexical-scoping constraint satisfy the shallow generator requirement? As I understand it, the issue with deep generators and CPS transformations is that the transformations have to be applied to functions that aren't even lexically inside the GeneratorFunction. Additionally, can't the nested CPS transformation issue be alleviated with a reference to the GeneratorFunction stack frame itself (a la SpiderMonkey)?

# Brandon Benvie (11 years ago)

On 7/15/2013 10:24 AM, Jeremy Martin wrote:

Correct me if I'm wrong, but wouldn't the lexical-scoping constraint satisfy the shallow generator requirement? As I understand it, the issue with deep generators and CPS transformations is that the transformations have to be applied to functions that aren't even lexically inside the GeneratorFunction. Additionally, can't the nested CPS transformation issue be alleviated with a reference to the GeneratorFunction stack frame itself (a la SpiderMonkey)?

Consider the following:

function* yieldEach(array){
  array.forEach(n => {
    yield n; // SyntaxError: yield cannot appear in a nested, non-generator function
  });
}

In order for this to work, not only does yieldEach have to be suspended for the inner yield, but forEach does as well. That means CPS transforming functions based on whether they call a yielding function.
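
For contrast, here is a sketch of what shallow generators do permit: keep every yield lexically inside a function* body, so only one frame ever needs suspending, and use yield* when one generator builds on another.

```javascript
// The shallow-generator-friendly version: yield stays lexically in the
// function* body, so only the generator's own frame is suspended.
function* yieldEach(array) {
  for (const n of array) {
    yield n;               // directly in the generator frame: fine
  }
}

// Nesting is done with yield*, which resumes the inner generator step by step:
function* yieldEachDelegating(array) {
  yield* yieldEach(array);
}

console.log([...yieldEachDelegating([1, 2, 3])]); // [1, 2, 3]
```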

# Jeremy Martin (11 years ago)

That means CPS transforming functions based on whether they call a yielding function.

Well, yes :) Admittedly, that does seem messy. Not to sound defeatist, but I'm sensing that I don't actually have the necessary knowledge to argue this further. I'll just end by saying that it will be unfortunate if we have to endure the clunkiness of nested generators anywhere higher-order functions are called for. Regardless of whatever solutions may or may not be appropriate, I at a minimum echo Claus' sentiment - it seems like there's a nice opportunity here to improve the modeling of control structures.

# Brendan Eich (11 years ago)

Jeremy Martin wrote:

Regardless of whatever solutions may or may not be appropriate, I at a minimum echo Claus' sentiment - it seems like there's a nice opportunity here to improve the modeling of control structures.

You're right, but the problem Mark cites is a big one and it stops any general call/cc or coroutine extension from going into JS.

I'd just add that call/cc in Scheme is a power tool, and (I'm told) you're supposed to use macros built on it, usually. The macros for delimited continuations address a sweet spot for control abstractions, and Dave Herman did give them careful consideration. Dave alludes to that work here:

esdiscuss/2010-December/012284

viz, "I pretty much abandoned that line of investigation with the conclusion that generators: ...", which I believe refers to

strawman:shallow_continuations

which is deferred and pretty much defunct due to generators.

# Bruno Jouhier (11 years ago)

There is no need to CPS transform functions and there is no need for deferred functions either. It can all be done with today's generators and a little helper library.

With the C# async/await notation:

  • The yield keyword is your "await" keyword.
  • The little * in function* is your "async" keyword.

With this you can write:

function* asyncEach(array, fn) {
  for (var i = 0; i < array.length; i++) yield fn(array[i], i);
}

and you can call it as:

function* myFunc(array) {
  yield asyncEach(array, function*(elt, i) {
    var foo = yield asyncBar(elt);
    // more ...
  })
}

Note that there is no helper API in all these async functions that call other async functions; the helper API is only needed to interface this with the classical callback world: at the very bottom of the stack, when you call low-level callback-based I/O functions, and at the top of the stack, when the event loop runs one of your generator functions.

The trick is that you need a clever run function to do the little yield/next dance with generator functions that call other generator functions.

I've implemented this in bjouhier/galaxy/blob/master/lib/galaxy.js (the run and invoke functions)
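
For illustration, here is a minimal, synchronous-only sketch of such a run function (not galaxy's actual code, which additionally handles a PENDING case for values that aren't ready yet). The driver does the yield/next dance: when a yielded value is itself a generator object, it runs that generator recursively and feeds its result back in.

```javascript
// Minimal sketch of a generator driver (illustrative, not galaxy's
// implementation). Detection via a .next method is a simplification.
function run(gen) {
  let input;
  while (true) {
    const { done, value } = gen.next(input); // resume with previous result
    if (done) return value;
    // A yielded generator object (the asyncEach pattern) is run to
    // completion recursively; any other value is passed straight back.
    input = (value && typeof value.next === 'function') ? run(value) : value;
  }
}

function* inner() {
  return (yield 20) + 1;   // receives 20 back from the driver
}
function* outer() {
  const x = yield inner(); // yields a generator object; run() recurses
  return x + 1;
}
console.log(run(outer())); // 22
```

A real driver along these lines would resume the generator from I/O callbacks instead of looping synchronously, which is where the PENDING bookkeeping comes in.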

The only thing I don't like about it is the awkward syntax:

  • yield precedence does not work well
  • yield is prefix, which does not chain well
  • and yield is heavy anyway

In short, this is a hack to get going but I'm still waiting for the full concurrency proposal and its awesome ! syntax.

# Ron Buckton (11 years ago)

Bruno, wouldn't yield* work here to delegate the inner yields?

# Ron Buckton (11 years ago)

I assume you are referring to something like Q.async/Q.spawn to turn the generator into an async function using promises?

# Bruno Jouhier (11 years ago)

I thought about yield* but it was not available at the time I wrote this. It does not fit the bill anyway, because it deals with the synchronous part of the generator dance, not the async part. My run loop does a bit more than the yield* loop (but not much more). The differences are the tests that deal with the PENDING case (bjouhier/galaxy/blob/master/lib/galaxy.js, lines 83 and 88).

I'm not using Q. I wanted to have minimal runtime overhead so I designed it so that it does not allocate any extra objects. It's all done with a simple loop.

# Claus Reinke (11 years ago)

This would make generators deep, violating the non-interleaving assumptions of intermediate callers on the call stack. This is why we accepted generators only on condition that they be shallow. We knew at the time that this privileges built-in control structures over user defined ones. The alternative would have been to omit generators completely. We agree that shallow generators were worth it, despite this non-uniformity.

While I understand the compromise, and the wish to get in some form of generators anyway, the discrimination against user-defined control structures troubles me deeply. It introduces a new language construct that defies abstraction. It means that we can no longer use functional abstraction freely, but have to worry about interactions with generators.

For the specific case of forEach et al, another way to avoid intermediate stack frames would be guaranteed inlining. If we always inline .forEach before execution, then specialize the resulting code wrt the callback, any yields in the callback would be directly in the caller. Consider this chain of code transformations:

// inline forEach; this still doesn't work
function* generator(){
    (function forEach(arr,cb) {
        for (var i=0; i<arr.length; i++) cb(arr[i]);
    })([1,2,3], function(x){ yield x } );
}

// instantiate inlined forEach; still doesn't work
function* generator(){
    let arr = [1,2,3];
    let cb = function(x){ yield x };
    for (var i=0; i<arr.length; i++) cb(arr[i]);
}

// inline cb; still doesn't work
function* generator(){
    let arr = [1,2,3];
    for (var i=0; i<arr.length; i++) (function(x){ yield x})(arr[i]);
}

// instantiate inlined cb; this should work
function* generator(){
    let arr = [1,2,3];
    for (var i=0; i<arr.length; i++) yield arr[i];
}

If such inlining and instantiation of functions changes the validity of ES6 code, then the opposite path - building abstractions from concrete code examples - is also affected. I find that worrying.

The final form of the code can be handled with shallow generators, and it should be semantically equivalent to the initial form (just function application and variable instantiation in between). So why shouldn't both forms be valid and doable without overcomplicating the shallow generator ideal?

In pragmatic terms, perhaps introducing inline annotations for operations like .forEach and for their callback parameters could avoid nested stack frames here without forcing user-side code duplication. Such annotation-enforced inlining should also help with the performance of .forEach et al (currently slower than plain for loops).

[in conventional pre-compiling FPL implementations, such worker/wrapper staging plus inlining is done at compile time (stage a recursive higher-order function into a non-recursive wrapper and a recursive but no longer higher-order worker; inline the wrapper to instantiate the functional parameters in the nested worker; finally, apply the standard optimizer);

it is an easy way to avoid deoptimizations caused by higher-order parameters interfering with code analysis, provided the library author helps with code staging and inline annotations]

Put another way, shallow generators are equivalent to a local cps transform of the generator function itself. Deep generators would require the equivalent of CPS transforming the world -- violating the stateful assumptions of existing code.

FYI:

I'm not sure what you mean by "violating the stateful assumptions", but there is an even more local transform than that of ES6 generators: writing code in monadic style always captures the local continuation only. That allows for generator monads that compose those local continuations back together.

An example of such a generator monad can be found here (using a list of steps for simplicity; code is TypeScript v0.9, to make use of ES6 classes with class-side inheritance and arrow functions)

https://gist.github.com/clausreinke/5984869#file-monadic-ts-L91-L125

with example code (using user-defined forOf) at

https://gist.github.com/clausreinke/5984869#file-monadic-ts-L492-L529

This differs from ES6 generators in using a functional API (next returns {done,value,next}) and in building on expressions and user-defined control-flow operations instead of statement blocks and built-in control-flow structures. Still, this style does seem to allow more reuse of existing ES5 array operations than ES6 generators will, as this small example demonstrates:

console.log("\n// mixing yield with higher-order array ops (prefix ^)");
var generator4 = ()=> [1,2,3].map( x=> G.yield(x) )
                             .reduce( (x,y)=> x.then( _=> y ), G.of(undefined) ) ;
MonadId.forOf( generator4(), y=> (console.log("^ "+y), MonadId.of(y)) );

Append the example to the end of that gist and execute it with tsc -e (TS v0.9 required; the playground or npm will do):

...
// mixing yield with higher-order array ops (prefix ^)
^ 1
^ 2
^ 3

That gist doesn't depend on language extensions, though some form of syntax sugar for monad comprehensions would help readability. And such monadic syntax sugar would help all monads, eg, there is a Promise monad in that same file.

# Mark S. Miller (11 years ago)

On Tue, Jul 16, 2013 at 10:54 AM, Claus Reinke <claus.reinke at talk21.com>wrote:

While I understand the compromise, and the wish to get in some form of generators anyway, the discrimination against user-defined control structures troubles me deeply.

Troubles me too. As of ES6 the only possible alternative would be to remove generators from the language. I can't see that happening.

It introduces a new language construct that defies abstraction. It means that we can no longer use functional abstraction freely, but have to worry about interactions with generators.

For the specific case of forEach et al, another way to avoid intermediate stack frames would be guaranteed inlining.

If this was the only motivation for introducing something like a guaranteed inline-ability annotation, I would not think it worth the price. However, such an annotation may serve multiple purposes. The key constraint it would need to impose is that the function is closed -- it doesn't capture any lexical variables other than the ES* defined globals. This is a minimal requirement for inlining, for parallelizability by Rivertrail, and for safe mobile code as in Q.there.

In any case, since functions by default are encapsulated, some explicit annotation or syntax of some sort is required for the function to waive its encapsulation.

# Brendan Eich (11 years ago)

Mark S. Miller wrote:

Troubles me too. As of ES6 the only possible alternative would be to remove generators from the language. I can't see that happening.

That would be somewhere between "making the perfect the enemy of the good" and "cutting off your nose to spite your face".

If we want call/cc (plus macros to sugar it into usability), there's the door -- it ain't in JS outside of Rhino, and (for good reasons you adduce) it won't be added.

But we have good use cases for generators, including to implement iterators independent of async. programming.

For async we're looking at defer/await for ES7.

This is how living languages evolve. Sorry to preach to the choir (I hope :-P).

# David Bruant (11 years ago)

Le 16/07/2013 19:54, Claus Reinke a écrit :

// this doesn't work
function* generator(){
    [1,2,3].forEach( function(x){ yield x } )
}

I have been thinking and with for..of, I can't find a good reason to use .forEach instead of for..of. for..of does what you need here with generators too.

For the specific case of forEach et al

What do you mean by "et al"? I don't believe .map, .reduce or .filter are at all interesting to use alongside generators.

Even if so, for..of can work too and is decently elegant (YMMV):

 function* g(){
     [1,2,3].map(x => {yield transform(x)})
 }

becomes

 function* g(){
     for(x of [1,2,3]) yield transform(x);
 }

# Rick Waldron (11 years ago)

On Tue, Jul 16, 2013 at 3:45 PM, David Bruant <bruant.d at gmail.com> wrote:

I have been thinking and with for..of, I can't find a good reason to use .forEach instead of for..of. for..of does what you need here with generators too.

I've been looking at this example and thinking the same thing. Additionally, I'm curious what this would have looked like without being a contrived example of misused yield, considering that Array.prototype.forEach returns undefined and the callback returns undefined as well (regardless of whether or not user code specifies a return).

# Claus Reinke (11 years ago)

I have been thinking and with for..of, I can't find a good reason to use .forEach instead of for..of. for..of does what you need here with generators too.

Perhaps you're right that .forEach is going to die (there are also generator expressions to consider, covering some of the other standard methods). It was the smallest example I could think of to illustrate the point.

However, the argument is not about a specific operation but about being able to define such operations in user code (eg, array comprehensions can usually be mapped to uses of .map, .concat, .filter; loops can be mapped to tail recursion; ...). User-defined control structures can be extended/modified without waiting for the language as a whole to evolve. If the equivalence between built-in and user-defined operation is broken, that option is no longer fully functional.

What do you mean by "et al"? I don't believe .map, .reduce or .filter are at all interesting to use alongside generators.

And why not? Because yield is a statement, and because those operations have not been (cannot be) extended to work with generators. Why shouldn't I be able to traverse an array, using the ES5 standard operations for doing so, yielding intermediate results from the traversal (recall also that yield can return data sent in via .next, for incorporation into such traversals)?
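
The two-way nature of yield mentioned here can be sketched as follows: the argument to .next() becomes the value of the paused yield expression, so a traversal can fold external input into its results. The weighted-sum example is illustrative only.

```javascript
// yield is two-way: next(v) makes v the value of the paused yield
// expression, so the caller can steer the traversal as it runs.
function* accumulate(array) {
  let total = 0;
  for (const x of array) {
    // Yield the running total out; take a weight for this element in.
    const weight = yield total;
    total += x * (weight === undefined ? 1 : weight);
  }
  return total;
}

const g = accumulate([10, 20, 30]);
console.log(g.next().value);  // 0   (first next() just starts the generator)
console.log(g.next(2).value); // 20  (0 + 10 * 2)
console.log(g.next(1).value); // 40  (20 + 20 * 1)
console.log(g.next(3).value); // 130 (40 + 30 * 3; done: true)
```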

Even if so, for..of can work too and is decently elegant (YMMV):

function* g(){
    [1,2,3].map(x => {yield transform(x)})
}

I fell for this, too :-) arrow functions have no generator equivalents.

becomes

function* g(){
    for(x of [1,2,3]) yield transform(x);
}

Methods can be replaced by built-ins. It is the reverse that is now broken.

# Claus Reinke (11 years ago)

I have been thinking and with for..of, I can't find a good reason to use .forEach instead of for..of. for..of does what you need here with generators too.

I've been looking at this example and thinking the same thing.

That's what you get for trying to use examples :-) long code doesn't get read, short code is taken too seriously. As I said in my reply to David, my point does not depend on this example. Still, given the readiness to abandon .forEach completely, it might be worthwhile to try and find a more realistic example, to see how big the damage is in practice.

Since we're talking about not completely implemented features, I don't have anything concrete yet, but perhaps in the direction of other callback-based APIs? Is there a way to use generators to enumerate directory trees in nodejs, or is it back to iterators?

Better examples welcome.

# David Bruant (11 years ago)

2013/7/17 Claus Reinke <claus.reinke at talk21.com>

What do you mean by "et al"? I don't believe .map, .reduce or .filter are at all interesting to use alongside generators.

And why not? Because yield is a statement, and because those operations have not been (cannot be) extended to work with generators. Why shouldn't I be able to traverse an array, using the ES5 standard operations for doing so, yielding intermediate results from the traversal (recall also that yield can return data sent in via .next, for incorporation into such traversals)?

I think that considering map, filter or reduce (especially reduce!) as traversal mechanism is a misuse of these methods. This is pretty much why find/findIndex has been added (every/some were "misused" as traversal mechanism with early ending to achieve what find/findIndex do).

I fell for this, too :-) arrow functions have no generator equivalents.

... oops, yeah sorry. So happy to use arrow function that I went out of my way :-)

Methods can be replaced by built-ins. It is the reverse that is now broken.

Maybe a solution would be that Array.prototype.map returns a generator when passed a generator as argument? The result generator would generate the new values. Could work with filter too (generate only filtered elements). reduce could empty out the generator to build the value.

function* filterG(e){ yield e % 2 === 0 }
function* mapG(e){ yield e * e }
function* reduceG(acc, e){ yield acc + e }

myArray.filter(filterG).map(mapG).reduce(reduceG)

hmm... my code doesn't work because generators don't have .map and .reduce methods. Maybe we can invent ArrayGenerators, which would be generators with Array.prototype somewhere in their prototype chain (array methods would output an ArrayGenerator if provided either a generator or an ArrayGenerator)?

Interestingly, myArray.filter(filterG).map(mapG) could be sort-of a lazy value. Actual computation filterG and mapG happens only when generating values from the

# David Bruant (11 years ago)

[Freaking Gmail shortcuts! Sorry about that]

2013/7/17 David Bruant <bruant.d at gmail.com>

Interestingly, myArray.filter(filterG).map(mapG) could be sort-of a lazy value. Actual computation filterG and mapG happens only when generating values from the

final generator.

This might enable optimizations that aren't currently possible with Array.prototype methods.

# Andy Wingo (11 years ago)

On Wed 17 Jul 2013 10:50, "Claus Reinke" <claus.reinke at talk21.com> writes:

And why not? Because yield is a statement

yield is an expression.

Why shouldn't I be able to traverse an array, using the ES5 standard operations for doing so, yielding intermediate results from the traversal (recall also that yield can return data sent in via .next, for incorporation into such traversals)?

You certainly can, with one modification: using ES6 standard operations (external iterators vs the ES5 forEach internal iterator). Generators and non-generator iterators and for-of and comprehensions hang together really nicely in practice.
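
One way to sketch this "hang together nicely" claim: because generators produce external iterators, other generators can consume them lazily, so something like take works even over an infinite producer, which a forEach-style internal iterator cannot express. The names here are illustrative.

```javascript
// External iteration composing: a consumer (take) pulls on demand from
// a producer (naturals), so even an infinite source is fine.
function* naturals() {
  for (let n = 0; ; n++) yield n; // infinite external iterator
}

function* take(iterable, count) {
  for (const x of iterable) {
    if (count-- <= 0) return; // stop pulling; the producer is simply dropped
    yield x;
  }
}

console.log([...take(naturals(), 4)]); // [0, 1, 2, 3]
```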

Methods can be replaced by built-ins. It is the reverse that is now broken.

Au contraire, but it requires a re-think on your part: favor external iterators over internal iterators.

# Claus Reinke (11 years ago)

And why not? Because yield is a statement

Yield is an expression.

Thanks for the correction. Yes, "yield expr" is an expression, syntactically.

It doesn't have the nice composition and code transformation properties that I usually associate with expressions, it imposes unusual restrictions on its context and impedes functional abstraction:

1. though "yield" constructs expressions from expressions, it isn't a function (you can't pass "yield" around or store it in a variable), nor is "yield expr" a function call.

2. point 1 can be worked around, but not with the usual tools of function definitions and calls - "yield" forces use of "function*" and "yield*" for abstracting over expressions containing it.

3. "yield" is disappointingly similar to "this", in being implicitly bound to the nearest "function*" ("function", for "this"). Expressions referencing either "this" or "yield" cannot be wrapped in functions (btw, can generator bodies reference an outer "this"?), because this would cut the implicit binding.

For "this", workarounds include arrow functions or "bind"; for "yield", the only workaround is "yield*"+"function*" (or diving even deeper, with hand-written iterators). Having to use different function mechanisms for the latter is by design, so it is a workaround only from the perspective of wanting to use uniform tools for functional abstraction.

For instance, we cannot write

function* g() {  (function(){ yield 1 })() }
function* g() {  function y(x){ yield x } y(1) }

but have to write

function* g() {  yield* (function*(){ yield 1 })() }
function* g() {  function* y(x){ yield x } yield* y(1) }

and when I try to write a polyfill for for-of, I end up with two partial fills (neither models early return):

function forof(g,cb) {  // non-generator callbacks only
  var r;
  while(true) {
    r = g.next();
    if (r.done) break; // skip return value?
    cb(r.value);
  }
}

function* forofG(g,cb) {  // generator callbacks only
  var r;
  while(true) {
    r = g.next();
    if (r.done) break; // skip return value?
    yield* cb(r.value);
  }
}

We could switch on the type of cb, and go down to handwritten iteration, to unify the two partials into one, but then we'd still have to cope with different usage patterns at the "call" sites (call with "function" vs. "yield*" with "function*").
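
As a sketch of that "switch on the type of cb" unification (names illustrative, and detection via a .next method on the callback's result is a simplification): a single generator-based loop can delegate with yield* when the callback returns an iterator, and rely on side effects otherwise. As noted, the call sites still differ.

```javascript
// One forof for both callback kinds (sketch). Assumes g is iterable.
function* forofEither(g, cb) {
  for (const value of g) {
    const r = cb(value);
    if (r && typeof r.next === 'function') {
      yield* r; // generator callback: delegate its yields outward
    }
    // plain callback: its side effects have already happened; nothing to yield
  }
}
```

Usage still splits by context: a plain caller must drain the generator (e.g. with for-of over `forofEither(...)`), while a generator caller writes `yield* forofEither(...)`.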

Why shouldn't I be able to traverse an array, using the ES5 standard operations for doing so, yielding intermediate results from the traversal (recall also that yield can return data sent in via .next, for incorporation into such traversals)?

You certainly can, with one modification: using ES6 standard operations (external iterators vs the ES5 forEach internal iterator). Generators and non-generator iterators and for-of and comprehensions hang together really nicely in practice.

function* g(){
    for(x of [1,2,3]) yield transform(x);
}

You're suggesting to abandon ES5 array iteration patterns in favor of more general ES6 iterator patterns. That would be okay (*), but

1 it leaves fairly new (ES5) API surface as legacy

2 generators do not compose as freely as iteration functions, because they are tied to special syntax and restricted contexts

(*) if we want to go down that route, then why align TypedArrays with Arrays via the old-style iteration API? Shouldn't both be covered by a common iterator-based API instead?

Hand-written iterators don't suffer from 2, but are somewhat awkward to write in place, and expose their lower-level protocol. Perhaps the solution is a rich enough standard iterator library, with generators as local glue and iterator library functions for supporting more general functional abstraction and composition.

Perhaps we need to play a bit more with such iterator library functions, to get a better feeling for the limitations imposed by generators, and to give my concerns a concrete form?

I've put up a gist with a few obvious things I'd want to have (something like "zip" and "feed" should really be standard; the former often has syntax support in the form of "parallel" comprehensions, the latter is needed if we want to use an input-dependent generator in a "for-of"):

https://gist.github.com/clausreinke/6073990

and there are several things I don't like, even at this simple stage:

  • if you compare the versions that use "for-of" with those (ending with a "_") that use a user-defined abstraction "forofG", you'll see a lot of syntax noise, even worse than with the old long-hand "function" - in terms of making functional abstraction readable, this is going in the wrong direction, opposite to arrow functions.

  • I haven't yet figured out how to end an outer generator early from within a "yield*" nested one (as needed for "take_"), without replacing "yield*" with a micro-interpreter. That might just be my incomplete reading of the draft spec, though?

Methods can be replaced by built-ins. It is the reverse that is now broken.

Au contraire, but it requires a re-think on your part: favor external iterators over internal iterators.

My main tool for building abstractions are functions. Generators won't play with functional abstraction. Instead, they force me to use generator abstraction. Using manual iterators instead exposes the low-level protocol details.

I'm not disputing that generators/iterators are more general/ flexible than array iteration methods. I don't know how to combine generators with functional abstraction and composition, so my normal means of building higher-level abstractions are broken.

Perhaps I'm just not using the tools available correctly. Perhaps you could start convincing me by showing me a "forof" implementation that can replace the "for-of" built-in, for all loop bodies?

Claus

# Brendan Eich (11 years ago)

Claus Reinke wrote:

And why not? Because yield is a statement

Yield is an expression.

Thanks for the correction. Yes, "yield expr" is an expression, syntactically.

It doesn't have the nice composition and code transformation properties that I usually associate with expressions, it imposes unusual restrictions on its context and impedes functional abstraction:

1. though "yield" constructs expressions from expressions, it isn't a function (you can't pass "yield" around or store it in a variable), nor is "yield expr" a function call.

Same for every other operator.

2. point 1 can be worked around, but not with the usual tools of function definitions and calls - "yield" forces use of "function*" and "yield*" for abstracting over expressions containing it.

So does 'return' and this is for a good reason: we are not adding deep continuations (as discussed up-thread).

3. "yield" is disappointingly similar to "this", in being implicitly bound to the nearest "function*" ("function", for "this"). Expressions referencing either "this" or "yield" cannot be wrapped in functions (btw, can generator bodies reference an outer "this"?), because this would cut the implicit binding.

No "binding" in the common sense of that word.

Again: same as 'return'.

For "this", workarounds include arrow functions or "bind", for "yield", the only workaround is "yield*"+"function*" (or diving even deeper, with hand-written iterators).

This rehashes a pointless lament that we don't have deep continuations.

Having to use different function mechanisms for the latter is by design, so it is a workaround only from the perspective of wanting to use uniform tools for functional abstraction.

For instance, we cannot write

function* g() { (function(){ yield 1 })() }
function* g() { function y(x){ yield x } y(1) }

but have to write

function* g() { yield* (function*(){ yield 1 })() }
function* g() { function* y(x){ yield x } yield* y(1) }

Same as 'return'.

We discussed escape continuations:

strawman:return_to_label

This strawman is not on any roadmap. It did not fare well in past TC39 meetings and discussions.

and when I try to write a polyfill for for-of,

Don't do that!

New special forms require compilers.

# Andy Wingo (11 years ago)

On Wed 24 Jul 2013 22:07, "Claus Reinke" <claus.reinke at talk21.com> writes:

2 generators do not compose as freely as iteration functions, because they are tied to special syntax and restricted contexts

You place blame on generators here, but beside the laments about deep coroutines -- totally understandable, but Brendan is right that they are pointless -- your examples apply just as well to iterators of all kinds. It just happens that generators are a convenient way to implement iterators.

Your point sounds like "external iteration does not compose as freely as internal iteration" -- which is strictly not true! You can't implement zip, for example, with internal iteration, whereas you can with external iteration.
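
This zip claim can be sketched directly (using final-ES6 Symbol.iterator; at the time the protocol was written @@iterator): zip must pull one element at a time from each source, which external iterators allow, whereas an internal iterator like forEach drives the whole loop itself and offers no way to interleave two traversals.

```javascript
// zip over two external iterators: advance each source in lockstep.
function* zip(a, b) {
  const ia = a[Symbol.iterator]();
  const ib = b[Symbol.iterator]();
  while (true) {
    const ra = ia.next();
    const rb = ib.next();
    if (ra.done || rb.done) return; // stop at the shorter source
    yield [ra.value, rb.value];
  }
}

console.log([...zip([1, 2, 3], 'ab')]); // [[1, 'a'], [2, 'b']]
```

Note that this works for any iterables (including infinite generators), not just indexable arrays, which is the sense in which internal iteration cannot express it.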

  • if you compare the versions that use "for-of" with those (ending with a "_") that use a user-defined abstraction "forofG", you'll see a lot of syntax noise, even worse than with the old long-hand "function" - in terms of making functional abstraction readable, this is going in the wrong direction, opposite to arrow functions.

I humbly suggest that these abstractions are simply in the wrong place. The defining characteristic of an external iterator is that the consumer is external to the producer. Trying to push the consumer inside is going to lead to contorted code, as you have seen.

  • I haven't yet figured out how to end an outer generator early from within a "yield*" nested one (as needed for "take_"), without replacing "yield*" with a micro-interpreter. That might just be my incomplete reading of the draft spec, though?

"I haven't yet figured out how to end an outer function early from within a nested function call, without replacing the call site with a micro-interpreter."

:)

# Claus Reinke (11 years ago)

2 generators do not compose as freely as iteration functions, because they are tied to special syntax and restricted contexts

You place blame on generators here, but beside the laments about deep coroutines -- totally understandable, but Brendan is right that they are pointless -- your examples apply just as well to iterators of all kinds. It just happens that generators are a convenient way to implement iterators.

First, let me clarify that I am on record arguing for even shallower continuations. I restated this preference in this very thread

esdiscuss/2013-July/031967 (scroll down to FYI)

I have no idea why both you and Brendan assert that I was arguing/rehashing for deep delimited continuations (let alone call/cc). But as this makes two of you, I've extracted my suggestions/questions to a separate thread, where they are less likely to be lost/misread.

Second, my worries are about generators, because they introduce a special built-in form that interferes with functional abstraction.

Your point sounds like "external iteration does not compose as freely as internal iteration" -- which is strictly not true! You can't implement zip, for example, with internal iteration, whereas you can with external iteration.

My point is about expressiveness in the composition/abstraction/refactoring/extract-reusable-components sense, not in the Turing sense. Also,

```
zip = (a,b)=>a.reduce(function(x,y,i){return x.concat([[y,b[i]]])},[])
```

  • if you compare the versions that use "for-of" with those (ending with a "_") that use a user-defined abstraction "forofG", you'll see a lot of syntax noise, even worse than with the old long-hand "function" - in terms of making functional abstraction readable, this is going in the wrong direction, opposite to arrow functions.

I humbly suggest that these abstractions are simply in the wrong place.

Since I was merely trying to re-implement parts of "for-of" in user land, I don't see how that could be the case.

Claus

# Brendan Eich (11 years ago)

Claus Reinke wrote:

I have no idea why both you and Brendan assert that I was arguing/rehashing for deep delimited continuations (let alone call/cc).

Because you wrote:

"1 can be worked around, but not with the usual tools of function
definitions and calls - "yield" forces use of "function*" and "yield*" for abstracting over expressions containing it."

and later:

"For instance, we cannot write

function* g() {  (function(){ yield 1 })() }"

This certainly looks like you want a deep continuation.

# Claus Reinke (11 years ago)

I have no idea why both you and Brendan assert that I was arguing/rehashing for deep delimited continuations (let alone call/cc).

Because you wrote:

"1 can be worked around, but not with the usual tools of function
definitions and calls - "yield" forces use of "function*" and "yield*" for abstracting over expressions containing it."

and later:

"For instance, we cannot write

function* g() { (function(){ yield 1 })() }"

This certainly looks like you want a deep continuation.

Ah, thanks, that explains it. Yes, I was listing examples that trouble me about the current design, I was suggesting changes to that design, and some of the examples could only be solved by deeper continuations.

The point of departure is that my suggested changes wouldn't actually solve all the cases that trouble me (in particular, not those cases that would depend on deep continuations). Issues that I've tried to address include:

  1. the example above doesn't require "deep" continuation any more than local variable declarations in a generator require them.

    Let me change the example to use immediately applied arrow functions (to avoid any special once-per-function-body handling of "this" and "arguments"); then I would expect

     function* g() {  (()=>{ let x = 1; yield x })() }
    

    to be equivalent to

     function* g() {  { let x = 1; yield x } }
    

    and if the latter is considered valid/shallow, I would expect the former to be valid/shallow, too.

  2. I'd like to decouple generators from "function", to avoid interference between the two features.

    For concreteness, let me assume a block form of generators as "do* { ... }" (delimiting continuations to the block, giving a generator-valued expression). Then the example would read (ignoring item 1 above for now, so we have to use "yield*"):

     var g = () => do* { yield* (()=> do* { yield 1 } )() }
    

    With "function", this would be slightly longer than with the current spec, but since generators are now decoupled from "function", we can use (arrow) functions (our means of functional abstraction) freely - generators are simply another class of object to write functions over.

    We could even re-introduce

     function* f() { ... }
    

    as mere syntactic sugar for

     function f() { return do* { ... } }
    
  3. I'd like to see a standard iterator library, with things like zip and feed (the exact contents of such a library would evolve in practice, not from a spec, but the spec could provide a seed, and organize the evolution), and I would like to see more support for composing generators.

    Using the current spec, we could define

     function* then(g1,g2) { yield* g1; yield* g2 }
    

    and use this to combine generators via ES5 array iterations

     [1,2,3].map(function*(x) { yield x }).reduce(then)
    

    or, assuming item 2 above,

     function then(g1,g2) { return do* { yield* g1; yield* g2 } }
    
     [1,2,3].map(x=>do* { yield x }).reduce(then)
    

    This, as well as my generators-as-monads gist, suggests that we could let generators return their completion value and have them implement monadic .then, for easy composition using the monadic set of tools.

    And since "yield*" is essentially a mini-interpreter built on top of "yield", the composition library could include alternative interpreters (eg, support for early return).

So, none of my suggestions require deep continuations. Nevertheless, I'm having trouble distinguishing local blocks in shallow-continuation generators from deep-continuation generators. So I'd be interested to hear the precise arguments against deep delimited continuations (link to meeting notes/mailing list thread would be fine).

Claus