Always close iterators?
Axel Rauschmayer wrote:
Given that redundant calls to return() don’t make a difference (h/t Bergi)
I'm sorry, that was not 100% accurate. I only referred to .return(x) returning {done: true, value: x} and .throw(e) being equivalent to `throw e;` when the generator was never started or is already completed (people.mozilla.org/~jorendorff/es6-draft.html#sec-generatorresumeabrupt).
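That equivalence for the not-yet-started case is easy to verify directly (a quick sketch; gen is a throwaway generator of my own):

```javascript
function* gen() {
  yield 1;
}

// Never started: .return(x) just reports back {value: x, done: true}
let g1 = gen();
console.log(g1.return(42)); // {value: 42, done: true}

// Never started: .throw(e) is equivalent to `throw e;`
let g2 = gen();
try {
  g2.throw(new Error("boom"));
} catch (e) {
  console.log(e.message); // "boom"
}
```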
In fact, there are generators that behave differently when they are prematurely aborted and multiple attempts are made to close them.
A contrived example:
function* unstoppableCounter(n) {
  try {
    while (true)
      yield n++;
  } finally {
    console.log("unclosable!");
    yield* unstoppableCounter(n);
  }
}
var counter = unstoppableCounter(0);
var i = 5;
for (var x of counter) {
  console.log(x);
  if (!--i) break;
}
i = 4;
for (var x of counter) {
  console.log(x);
  if (!--i) break;
}
Every call of counter.return() here would actually log "unclosable!", increase the counter, and yield {done: false, value: …}; in general, it might have side effects. So we shouldn't call it arbitrarily often.
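Concretely, each .return() call on such a counter runs the finally block again and produces another value (a sketch; unstoppableCounter is the generator from above, repeated here so the snippet is self-contained):

```javascript
// unstoppableCounter as defined above
function* unstoppableCounter(n) {
  try {
    while (true)
      yield n++;
  } finally {
    console.log("unclosable!");
    yield* unstoppableCounter(n);
  }
}

let c = unstoppableCounter(0);
console.log(c.next());   // {value: 0, done: false}
console.log(c.return()); // logs "unclosable!" first, then {value: 1, done: false}
console.log(c.return()); // logs "unclosable!" again, then {value: 2, done: false}
```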
couldn’t the iteration protocol be simpler if iterators were always closed? “Manually implemented” iterators could be written without the check in line (A)
I don't think that's how an explicit iterator would be written. Shouldn't it look more like
…
next() {
  if (iterationIsDone()) {
    cleanUp();
    return {done: true};
  } else {
    return {value: nextValue(), done: false};
  }
}
Of course, that assumes that we don't have a return value, and that next() is not called again after it has returned done: true once (otherwise we'd clean up multiple times).
Maybe better:
…
next() {
  if (iterationIsDone()) {
    return {done: true};
  } else {
    let result = {value: nextValue(), done: iterationIsDone()};
    if (result.done) cleanUp(); // (B)
    return result;
  }
}
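Filled in with a concrete backing source, that second variant might look like this (makeIterator, the array source, and the cleaned flag are my own illustration; iterationIsDone, nextValue, and cleanUp are the placeholders from above):

```javascript
function makeIterator(items) {
  let i = 0;
  let cleaned = false;
  const iterationIsDone = () => i >= items.length; // placeholder condition
  const nextValue = () => items[i++];              // placeholder value source
  const cleanUp = () => { cleaned = true; };       // stands in for releasing a real resource
  return {
    next() {
      if (iterationIsDone()) {
        return {done: true};
      } else {
        let result = {value: nextValue(), done: iterationIsDone()};
        if (result.done) cleanUp(); // (B)
        return result;
      }
    }
  };
}

const it = makeIterator(["a", "b"]);
console.log(it.next()); // {value: "a", done: false}
console.log(it.next()); // {value: "b", done: true} (cleanUp has already run here)
```

Note that with this variant the final value arrives together with done: true, which is exactly the protocol change under discussion.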
Admittedly, that has the line you argued against again…
I'm sorry, that was not 100% accurate.
Right, ignore that part of my email. My main argument is: “wouldn’t the iteration protocol be simpler if iterators were always closed (versus only if iteration finishes abruptly)?”
Hm, I can't see much harm in that; it does seem to simplify the implementation of iterators. It changes the semantics from
| cleanup must be done before returning the iterresult
(that's how finally works) to
| after having returned the iterresult, you will be made to clean up
Or, actually, that doesn't sound so good. "Someone will hopefully tell you" to clean up is not what good, resilient design looks like.
While it does indeed simplify the producer implementation, the consumer side gets more complicated: from
for (let iter = getIterator(), value, done; ({value, done} = iter.next()), !done; ) {
  console.log(value);
  // Assuming this never throws
}
to
let iter = getIterator();
for (let value, done; ({value, done} = iter.next()), !done; ) {
  console.log(value);
  // Assuming this never throws
}
iter.return(); // additional line required
Of course, if done properly (accounting for exceptions), the change is minimal: from try-catch to try-finally. But who does that, and how often is it forgotten?
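Done properly, the consumer sketch above becomes something like the following (getIterator is the same hypothetical helper; here it is stubbed with an array iterator, which notably has no return method of its own, hence the guard):

```javascript
// Stub for the hypothetical getIterator() used in the snippets above.
const getIterator = () => ["a", "b", "c"][Symbol.iterator]();

const iter = getIterator();
try {
  for (let value, done; ({value, done} = iter.next()), !done; ) {
    console.log(value);
  }
} finally {
  // Runs whether the loop finished normally or the body threw.
  if (typeof iter.return === "function") iter.return();
}
```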
Given that redundant calls to return() don’t make a difference (h/t Bergi), wouldn’t the iteration protocol be simpler if iterators were always closed (versus only if iteration finishes abruptly)? The code of generators wouldn’t change (finally etc.), but “manually implemented” iterators could be written without the check in line (A). They’d become simpler and more similar to generators.
let iterable = {
  [Symbol.iterator]() {
    return {
      next() {
        if (iterationIsDone()) {
          return { done: true };
        } else {
          let result = { value: nextValue(), done: false };
          if (iterationIsDone()) { // (A)
            cleanUp();
          }
          return result;
        }
      },
      return(value) {
        if (!iterationIsDone()) {
          cleanUp();
        }
        return { value: value, done: true };
      }
    };
  }
};
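For contrast with the "always closed" proposal, under the current protocol for-of invokes .return() only on abrupt exit; that is exactly why next() needs the check in line (A). A sketch with a hypothetical logging iterable:

```javascript
// Hypothetical iterable whose iterator logs when return() is invoked.
function makeLoggingIterable(items) {
  return {
    [Symbol.iterator]() {
      const inner = items[Symbol.iterator]();
      return {
        next: () => inner.next(),
        return(value) {
          console.log("return() called");
          return {value: value, done: true};
        }
      };
    }
  };
}

for (const x of makeLoggingIterable([1, 2, 3])) {
  // runs to completion: return() is NOT called
}
for (const x of makeLoggingIterable([1, 2, 3])) {
  break; // abrupt exit: logs "return() called"
}
```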