An update on the rest operator?
Previous discussions:
esdiscuss.org/topic/strawman-complete-array-and-object-destructuring
esdiscuss.org/topic/rest-parameters
-- T.J. Crowder
Thanks a lot.
I was not sure if this would be related to the rest operator or to destructuring...
And I wasn't aware that «The reason this doesn't work is because ... in this context is not array destructuring - it's iterable destructuring.»
I would love to read a reply to the last message (from Isiah in this thread: esdiscuss.org/topic/strawman-complete-array-and-object-destructuring)
-- Siegfried Ehret
I still think it's silly that [...rest, last] isn't allowed. The point that it's "iterable destructuring" is actually irrelevant: if there's a spread, it's always coerced to an array, e.g.:
function* nums() {
yield 1
yield 2
yield 3
yield 4
}
const [first, ...rest] = nums()
Array.isArray(rest) // true, it's not an iterator for the rest of the nums,
// it's just an Array of [2,3,4]
Given the above, it should really just be the case that [...rest, last] would just be the same as destructuring the reversed array, then re-reversing the rest part, e.g.:
const [...rest, secondLast, last] = someIterable
// Would be equivalent to
const __arr = Array.from(someIterable).reverse()
const [last, secondLast, ...rest] = __arr
rest.reverse() // Reverse it back to correct order
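As a runnable sketch of that desugaring (the helper name destructureTrailing is made up for illustration; the real proposal would be syntax, not a function):

```javascript
// Hypothetical desugaring of `const [...rest, secondLast, last] = iterable`
// using the reverse trick described above.
function destructureTrailing(iterable) {
  const __arr = Array.from(iterable).reverse();
  const [last, secondLast, ...rest] = __arr;
  rest.reverse(); // Reverse back to the original order
  return { rest, secondLast, last };
}

const { rest, secondLast, last } = destructureTrailing([1, 2, 3, 4, 5]);
// rest -> [1, 2, 3], secondLast -> 4, last -> 5
```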
Really the only contentious case is if it's in the middle because then you have to decide which direction to consume from e.g.
const [a, ...rest, c] = [1]
// Possible values for a, c
// 1, 1
// 1, undefined
// undefined, 1
Personally I'd be quite happy to at least get the [...rest, last] case even if the middle case couldn't be agreed upon, but someone in TC39 would need to champion it.
On Thu, Aug 3, 2017 at 5:44 AM, James Browning <thejamesernator at gmail.com> wrote:
Given the above, it should really just be the case that
[...rest, last]
would just be the same as destructuring the reversed array, then re-reversing the rest part
Or the other way to think of it, since as you say it's going to end up in an array anyway and by the time the expression is parsed, the number of identifiers after the rest identifier is known:
const [...rest, secondLast, last] = someIterable;
becomes
const a = [...someIterable];
const rest = a.slice(0, -2);
const [secondLast, last] = a.slice(-2);
(Theoretically; presumably the temporary arrays would be optimized out.)
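For concreteness, here is that transform applied to a real iterable (the generator is just a stand-in for someIterable):

```javascript
// Stand-in for `someIterable` in the transform above
function* someIterable() {
  yield 1; yield 2; yield 3; yield 4;
}

const a = [...someIterable()];
const rest = a.slice(0, -2);            // everything before the last two
const [secondLast, last] = a.slice(-2); // the last two items
// rest -> [1, 2], secondLast -> 3, last -> 4
```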
Really the only contentious case is if it's in the middle because then you have to decide which direction to consume from e.g.
const [a, ...rest, c] = [1]
// Possible values for a, c
// 1, 1
// 1, undefined
// undefined, 1
I understand the last two, where presumably rest is [] (but would strongly argue for 1, [], undefined -- e.g., greediness), but what's the logic that would explain a = 1, c = 1? That doesn't seem to make any sense.
-- T.J. Crowder
The 1, 1 would happen if you decided that [a, ...rest, b] read in both directions (although personally I'm not a fan of this approach), e.g.
const arr = [1]
const [a, ...rest, b] = arr
// Roughly equivalent to:
const [a] = arr.slice(0, 1)
const [b] = arr.slice(-1) // So they get duplicated
const rest = arr.slice(1, -1) // Empty
// ---- Similarly for a longer array
const arr = [1, 2]
const [a, b, ...rest, c, d] = arr
// Would be roughly equivalent to
const [a,b] = arr.slice(0, 2)
const [d, c] = arr.slice(-2).reverse()
const rest = arr.slice(2, -2) // Which is empty in this case
One option could be (although I don't like it either) to allow the rest operator to have a direction e.g.:
const [a, ...rest, b] = [1] // a -> 1, b -> undefined
// And the other way
const [a, rest..., b] = [1] // a -> undefined, b -> 1
Personally I think that'd make it more confusing, but it's potentially an option.
Another option could even be that [a, ...rest, b] simply throws on an iterable with fewer than 2 items, but that's not consistent with the current behavior of [a, b] not throwing on iterables with fewer than 2 items.
I think T.J. had the most intuitive logic (and this has been mentioned in previous threads too), where non-rest bindings have priority:
const [a, ...rest, c] = [1] // -> 1, [], undefined
const [a, ...rest, c] = [1, 2] // -> 1, [], 2
const [a, ...rest, c] = [1, 2, 3] // -> 1, [2], 3
If you think of rest as "everything else" (which is what it already is) then this feels pretty natural and is easy to reason about.
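A sketch of that priority rule as a helper function (the name destructureWithRest and its result shape are invented for illustration; the proposal itself would be destructuring syntax):

```javascript
// Non-rest bindings take priority: the first `leadingCount` items fill the
// leading bindings, the last items fill the trailing bindings, and the rest
// binding collects whatever is left over.
function destructureWithRest(iterable, leadingCount, trailingCount) {
  const arr = Array.from(iterable);
  const leading = arr.slice(0, leadingCount);
  const afterLeading = arr.slice(leadingCount);
  const restLength = Math.max(0, afterLeading.length - trailingCount);
  const rest = afterLeading.slice(0, restLength);
  const trailing = afterLeading.slice(restLength);
  // Pad with undefined so short inputs behave like plain destructuring
  while (leading.length < leadingCount) leading.push(undefined);
  while (trailing.length < trailingCount) trailing.push(undefined);
  return { leading, rest, trailing };
}

// Matches the cases above for `const [a, ...rest, c] = ...`:
// destructureWithRest([1], 1, 1)       -> { leading: [1], rest: [], trailing: [undefined] }
// destructureWithRest([1, 2], 1, 1)    -> { leading: [1], rest: [], trailing: [2] }
// destructureWithRest([1, 2, 3], 1, 1) -> { leading: [1], rest: [2], trailing: [3] }
```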
On Thu, Aug 3, 2017 at 10:18 AM, James Browning <thejamesernator at gmail.com> wrote:
The 1, 1 would happen if you decided that [a, ...rest, b] read in both directions (although personally I'm not a fan of this approach)...
Gotcha. Yeah, having it duplicate things would seem wrong. :-)
One option could be (although I don't like it either) to allow the rest operator to have a direction...
I'd say: Keep it simple. Left-to-right, non-duplicating, non-greedy with respect to non-rest bindings, greedy otherwise:
function *source(len) {
for (let n = 1; n <= len; ++n) {
yield n;
}
}
function test(len) {
const [ a, ...rest, b, c ] = source(len);
console.log("With " + len + ":", a, rest, b, c);
}
test(0); // With 0: undefined, [], undefined, undefined
test(1); // With 1: 1, [], undefined, undefined
test(2); // With 2: 1, [], 2, undefined
test(3); // With 3: 1, [], 2, 3
test(4); // With 4: 1, [2], 3, 4
test(5); // With 5: 1, [2, 3], 4, 5
On Thu, Aug 3, 2017 at 10:29 AM, Andy Earnshaw <andyearnshaw at gmail.com> wrote:
If you think of rest as "everything else" (which is what it already is) then this feels pretty natural and is easy to reason about.
Exactly. (And as you say, hardly original with me.)
A pragmatic approach could simply consume the rest of the iterable into the rest binding and then if there are more bindings after it, move those entries into them:
function assignToBindings(bindings, iterator) {
let bindingIndex = 0;
let currentBinding;
let e;
// Consume bindings prior to rest
while ((currentBinding = bindings[bindingIndex++]) && !currentBinding.isRest) {
e = iterator.next();
if (e.done) {
return;
}
currentBinding.value = e.value;
}
if (!currentBinding) {
return; // Out of bindings
}
// Read to the end into the rest binding
assert(currentBinding.isRest, "hit rest binding");
const rest = currentBinding.value = []; // Start with an empty array
while (!(e = iterator.next()).done) {
rest.push(e.value);
}
if (bindingIndex >= bindings.length) {
return; // rest binding was last binding
}
// Move trailing entries out of the rest binding into the trailing bindings
const restLength = Math.max(0, rest.length - (bindings.length - bindingIndex));
for (let restIndex = restLength; restIndex < rest.length; ++restIndex) {
bindings[bindingIndex++].value = rest[restIndex];
}
rest.length = restLength;
}
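To see that sketch in action, here is a self-contained run (the function is repeated in condensed form, and the binding-descriptor shape is invented for the demo):

```javascript
// Condensed copy of the assignToBindings sketch, for a runnable demo.
function assignToBindings(bindings, iterator) {
  let bindingIndex = 0, currentBinding, e;
  // Consume bindings prior to rest
  while ((currentBinding = bindings[bindingIndex++]) && !currentBinding.isRest) {
    e = iterator.next();
    if (e.done) return;
    currentBinding.value = e.value;
  }
  if (!currentBinding) return; // Out of bindings
  // Read to the end into the rest binding
  const rest = currentBinding.value = [];
  while (!(e = iterator.next()).done) rest.push(e.value);
  if (bindingIndex >= bindings.length) return; // Rest binding was last
  // Move trailing entries out of the rest binding into the trailing bindings
  const restLength = Math.max(0, rest.length - (bindings.length - bindingIndex));
  for (let restIndex = restLength; restIndex < rest.length; ++restIndex) {
    bindings[bindingIndex++].value = rest[restIndex];
  }
  rest.length = restLength;
}

// Descriptors for `const [a, ...rest, b, c] = source()`
const bindings = [
  { name: "a", isRest: false },
  { name: "rest", isRest: true },
  { name: "b", isRest: false },
  { name: "c", isRest: false },
];

function* source() { yield 1; yield 2; yield 3; yield 4; yield 5; }

assignToBindings(bindings, source());
// a = 1, rest = [2, 3], b = 4, c = 5
```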
-- T.J. Crowder
Hello,
This is a small idea I had recently: to update the rest operator to make it work not only to get the last elements.
As an example, we already have:
const myArray = [1, 2, 3, 4, 5, ..., 99, 100];

[first, second, ...rest] = myArray;
// first = 1
// second = 2
// rest = [3, 4, 5, ..., 99, 100]
It would be interesting to have:
[...rest, previous, last] = myArray;
// rest = [1, 2, 3, ..., 97, 98]
// previous = 99
// last = 100
And:
[first, ...rest, last] = myArray;
// first = 1
// rest = [2, 3, ..., 97, 98, 99]
// last = 100
Another use case: inside a variadic function to separate the callback (often the last argument) and the function parameters.
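Today that separation has to be done by hand with slice. A sketch of both styles (request is a made-up function name):

```javascript
// Current workaround: split the callback off the end manually.
function request(...args) {
  const callback = args[args.length - 1];
  const params = args.slice(0, -1);
  callback(params);
}

// Under the proposal this could be written directly as:
//   function request(...params, callback) { callback(params); }

request("GET", "/users", (params) => {
  // params -> ["GET", "/users"]
});
```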
I asked myself «why would I want the last elements of an array?», and my answer was «for the same reason that motivates me to get the first ones».
What do you think?