Proposal: rest operator in middle of array

# Ethan Resnick (5 years ago)

Long-time mostly-lurker on here. I deeply appreciate all the hard work that folks here put into JS.

I've run into a couple cases now where it'd be convenient to use a rest operator at the beginning or middle of an array destructuring, as in:

const [...xs, y] = someArray;

Or, similarly, in function signatures:

function(...xs, y) { }

The semantics would be simple: exhaust the iterable to create the array of xs, like a standard rest operator would do, but then slice off the last item and put it in y.
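Today that semantics has to be written by hand. A minimal sketch in current JS (the `splitLast` helper name is mine, purely illustrative):

```javascript
// Approximates the proposed `const [...xs, y] = iterable;`:
// exhaust the iterable, then peel off the last element.
function splitLast(iterable) {
  const xs = [...iterable]; // exhaust the iterable, like a standard rest would
  const y = xs.pop();       // last element (undefined if the iterable was empty)
  return [xs, y];
}

const [xs, y] = splitLast([1, 2, 3]);
// xs is [1, 2], y is 3
```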

For example, I was working with some variable argument functions that, in FP style, always take their data last. So I had a function like this:

function match(...matchersAndData) {
  const matchers = matchersAndData.slice(0, -1);
  const data = matchersAndData[matchersAndData.length - 1];
  // do matching against data
}

Under this proposal, the above could be rewritten:

function match(...matchers, data) { /* ... */ }

Another example: a function pad, which takes a target length and a string to pad, with an optional padding character argument in between:

function pad(targetLength, ...paddingCharAndOrData) {
  const [paddingChar = " "] = paddingCharAndOrData.slice(0, -1);
  const data = paddingCharAndOrData[paddingCharAndOrData.length - 1];

  // pad data with paddingChar to targetLength;
}
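To make the current-JS version concrete, here is a runnable variant; the padding logic via `String.prototype.padStart` is my own stand-in for the elided body:

```javascript
function pad(targetLength, ...paddingCharAndOrData) {
  // Everything before the last argument is the (possibly empty) options list.
  const [paddingChar = " "] = paddingCharAndOrData.slice(0, -1);
  const data = paddingCharAndOrData[paddingCharAndOrData.length - 1];
  return data.padStart(targetLength, paddingChar);
}

pad(5, "abc");      // "  abc" -- padding char defaults to a space
pad(5, "0", "abc"); // "00abc" -- optional padding char supplied in the middle
```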

With this proposal, that could be rewritten:

function pad(targetLength, ...opts, data) {
  const [paddingChar = " "] = opts;
  // pad data with paddingChar to targetLength;
}

I'm curious if this has been considered before, and what people think of the idea.

Obviously, if ...a appeared at the beginning or middle of a list, there would have to be a fixed number of items following it, so a subsequent rest operator in the same list would not be allowed.

# Andy Earnshaw (5 years ago)

This has come up several times and, while it seems pretty intuitive to me, not everyone seems to agree. You can check the archives for previous discussions.

# guest271314 (5 years ago)

I've run into a couple cases now where it'd be convenient to use a rest operator at the beginning or middle of an array destructuring, as in:

const [...xs, y] = someArray;

The semantics would be simple: exhaust the iterable to create the array of xs, like a standard rest operator would do, but then slice off the last item and put it in y.

const [[...xs], y = xs.pop()] = [someArray];

or

let condition = "b";
const [[...xs], [y] = xs.splice(xs.findIndex(v => v === condition), 1)] = [someArray];
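For readers following along, the first pattern runs like this today (the input array is my own example):

```javascript
const someArray = [1, 2, 3];

// Wrapping someArray in an array lets the inner `[...xs]` take a copy;
// the default initializer for `y` then pops the last element off that copy.
const [[...xs], y = xs.pop()] = [someArray];

console.log(xs, y);     // [ 1, 2 ] 3
console.log(someArray); // [ 1, 2, 3 ] -- the original is not mutated
```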

For example, I was working with some variable argument functions that, in FP style, always take their data last. So I had a function like this:

function match(...matchersAndData) {
 const matchers = matchersAndData.slice(0, -1);
 const data = matchersAndData[matchersAndData.length - 1];
 // do matching against data
}

Under this proposal, the above could be rewritten:

function match(...matchers, data) { /* ... */ }

A similar pattern can be applied in the function's parameter list by dynamically setting a default parameter:

function match(matchers, data = matchers.pop()) {
  console.log(matchers, data);
}

match(`/a/ /b/ /c/ data`.split` `)
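For clarity, the call above passes one array argument; the default parameter then pops the data element off that same array (so the caller's array is mutated). A self-contained version, returning the pieces instead of logging them:

```javascript
function match(matchers, data = matchers.pop()) {
  return { matchers, data };
}

const result = match(`/a/ /b/ /c/ data`.split` `);
// result.matchers is ["/a/", "/b/", "/c/"], result.data is "data"
```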

Can you describe the difference between the pad function and the match function?

--

If I read the pad function accurately, you can use object destructuring with default parameters; see stackoverflow.com/a/43637164

function pad({targetLength = 1, opts = [], paddingChar = opts.length ? opts.shift() : " ", data = void 0} = {}) {
  console.log(targetLength, paddingChar, opts, data);
}

pad({opts:`  /a/ /b/ /c/`.split(/\s(?!\s)/)})

# kai zhu (5 years ago)

-1 for maintainability and debuggability

  1. maintainability: if you want to extend the function with additional args, then you'll have to retroactively modify all existing calls to avoid an off-by-one argcount:

// if extending function with additional args
function pad(targetLength, ...opts, data) ->

function pad(targetLength, ...opts, data, meta)

// then must retroactively append null/undefined to all existing calls
pad(1, opt1, opt2, "data") ->

pad(1, opt1, opt2, "data", null)

  2. debuggability: when debugging, it takes longer for a human to figure out which arg is what:

// function pad(targetLength, ...opts, data)
pad(aa, bb, cc, dd);
pad(aa, bb, cc, dd, ee);

// vs

// function pad(targetLength, opts, data)
pad(aa, [bb, cc], dd);
pad(aa, [bb, cc, dd], ee);

# Isiah Meadows (5 years ago)

For your maintainability argument: adding extra arguments to those functions is something I almost never do. And you'd have the same exact issue with final rest parameters, just in a different position (in the middle as opposed to at the end).

For debuggability, I don't see how it'd be a major issue unless you already have an excessive number of positional parameters. In my experience, the debuggability issues arise when there are simply too many positional parameters, and factoring out the rest parameter into an array doesn't really help that situation much. (That's when object destructuring comes in handy.)

So not convinced either is any different than what it's like today.

Also, you aren't obligated to use a feature just because it exists - I hardly ever use proxies, for instance, and I rarely need maps beyond what objects give me, so I don't normally use them unless I need to have reference types or mixed types as keys.


Isiah Meadows contact at isiahmeadows.com, www.isiahmeadows.com

# kai zhu (5 years ago)

it matters when you have to debug/inherit other people's code (and clean up their mess). i wouldn't enjoy debugging unfamiliar code that used this feature (admittedly, it's my subjective opinion).

the maintainability argument stands -- it's counter-intuitive in javascript that appending extra args to a function re-arranges arg-positioning and invalidates existing calls.

debuggability is subjective i agree.

p.s. - in general, i don't see what real pain point the rest operator actually addresses that couldn't be solved with arguments. variable-arg-length functions are not javascripty -- they frequently require extra ux-workflow transformations like Function.prototype.apply or Array.prototype.flatMap on the arguments being passed.

# guest271314 (5 years ago)

it's counter-intuitive in javascript that appending extra args to a function re-arranges arg-positioning and invalidates existing calls.

Appending extra arguments to a function is not an issue. A generator can now be passed to a function

function* gen(n = 0) { while (n < 3) { yield ++n } } // bounded: spread syntax exhausts the generator, so it must terminate

match(...gen())

where the author should reasonably expect (or will discover) that subsequent arguments could affect previous arguments, which could, in general, be the purpose of "appending" the extra arguments.

From what I have been able to gather (if I am reading the linked answer correctly), the arguments to a function are evaluated in order, with nested calls evaluated from the inner-most value outward (see tc39/ecma262#1397, "Does any JavaScript specification define the order of function execution where the arguments to the function are a nested array of function calls?").

The proposal is technically already possible using existing JavaScript features (default values; default parameters; ... syntax; generator functions; Object.assign(); computed property names; destructuring assignment; et al.). As stackoverflow.com/a/37152508 explains:

fun(a, b, ...c): This construct doesn't actually have a name in the spec (www.ecma-international.org/ecma-262/6.0/#sec-left-hand-side-expressions). But it works very similarly to spread elements: it expands an iterable into the list of arguments. It would be equivalent to func.apply(null, [a, b].concat(c)).

# Ethan Resnick (5 years ago)

This has come up several times and, while it seems pretty intuitive to me, not everyone seems to agree. You can check the archives for previous discussions.

@Andy Perhaps you can provide some links? I found two threads — esdiscuss.org/topic/an-array-destructing-specification-choice#content-69 and esdiscuss.org/topic/early-spread-operator — both 8 years old — that talked about this, along with one more recent one, esdiscuss.org/topic/strawman-complete-array-and-object-destructuring, that didn't get very far. In the first two threads, commenters brought up one case where the semantics are unclear (i.e., when there are more listed binding elements than there are elements in the iterable), and there was some talk about implementation complexity. But there was also some interest from some big contributors to the spec. So I wonder if it's time to revisit this?

if you want to extend the function with additional args, then you'll have to retroactively modify all existing calls to avoid off-by-one argcount:

@Kai I'm not sure I follow. If the new argument is required, you have to modify all existing calls whether the new argument goes at the end or as the second to last argument. If the new argument is optional, then adding it as the second to last argument doesn't break existing calls at all, assuming the function accounts for the fact that the optional arguments are all the ones after the initial required ones, and up until (but excluding) the last one. The syntax I'm proposing makes adding such extra arguments easy. In other words:

function pad(targetLength, ...opts, data) {
  const [paddingChar = " "] = opts;
  // pad data with paddingChar to targetLength;
}

would, with the addition of an optional "meta" arg, become:

function pad(targetLength, ...opts, data) {
  const [paddingChar = " ", meta = { /* some default */ }] = opts;
  // pad data with paddingChar to targetLength;
}
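The same evolution can be emulated in current JS to check that old call sites keep working; the return shape and `padStart` logic here are mine, purely illustrative:

```javascript
// Emulates the proposed `function pad(targetLength, ...opts, data)`:
function pad(targetLength, ...optsAndData) {
  const opts = optsAndData.slice(0, -1);
  const data = optsAndData[optsAndData.length - 1];
  const [paddingChar = " ", meta = {}] = opts;
  return { padded: String(data).padStart(targetLength, paddingChar), meta };
}

pad(5, "abc");                        // old calls still work: padded is "  abc"
pad(5, "0", "abc");                   // explicit padding char: padded is "00abc"
pad(5, "0", { trace: true }, "abc");  // new optional meta slot, no call-site churn
```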

More importantly, though, this data-last calling pattern has a long history, and there are some cases where it's the best solution. My use case was similar to the common one, namely, that I was building a data pipeline, with the "settings" arguments partially applied from the left to create the function to use in each step. (And, in my case, I was building each function in a DSL where it would've been very inconvenient to add arbitrary-order partial application, even if JS were to add something like that tc39/proposal-partial-application.)

Given that functions where the last argument is significant probably aren't going away, it'd be nice imo if the JS syntax supported this better.

# guest271314 (5 years ago)

The proposed pattern is already possible, as demonstrated by the code at esdiscuss.org/topic/proposal-rest-operator-in-middle-of-array#content-2.

How does the code at that post specifically not provide the same functionality as proposed?

# guest271314 (5 years ago)

but imo it really masks the author's intention (in addition to being more verbose and less efficient) when compared to:

const [...xs, y] = someArray;

No intention is "masked". Yes, it is more verbose, though it demonstrates that the expected output is now possible using the current version of JavaScript.

"less efficient" is a statement which requires evidence. Compared to what? How is "efficient" benchmarked and measured and compared?

The proposed solutions for argument lists strike me as less-good.

Again, the term "less-good" is entirely subjective. There is no "good" or "bad" code, there is only code. The opinion of "good" or "bad" is in the individual human mind, not the code, and is subject to change from one moment to the next.

function match(matchers, data = matchers.pop()) {
  console.log(matchers, data);
}
const x = [[1,2],3];
match(x);
console.log(x); // [[1,2]], uh oh, x has changed.

As mentioned in the post, the same pattern as const [[...xs], y = xs.pop()] = [someArray]; can be used

function match([...matchers], data = matchers.pop()) {
  console.log(matchers, data);
}
const x = [[1,2],3];
match(x);
console.log(x); // [[1,2],3] -- no mutation

Each of the examples is already achievable using the JavaScript shipped with browsers. I could not determine what the actual issue is with the existing code. It is already possible to compose code where the center argument, or any argument identifier, can be spread to an array. If the input is individual arguments to a function, those arguments, too, can be converted to an array within the arguments scope, as it is possible to run functions in the argument scope which change the current and subsequent parameters. If the concept is dynamic and arbitrary arguments, you can use object destructuring, default parameters, default values, Object.assign(), or, if necessary, a generator function.

# Andy Earnshaw (5 years ago)

On Mon, 10 Jun 2019 at 22:20, Ethan Resnick <ethan.resnick at gmail.com> wrote:

@Andy Perhaps you can provide some links? I found two esdiscuss.org/topic/an-array-destructing-specification-choice#content-69 threads esdiscuss.org/topic/early-spread-operator — both 8 years old — that talked about this, along with one more recent one esdiscuss.org/topic/strawman-complete-array-and-object-destructuring that didn't get very far. In the first two threads, commenters brought up one case where the semantics are unclear (i.e., when there are more listed binding elements than there are elements in the iterable), and there was some talk about implementation complexity. But there was also some interest from some big contributors to the spec. So I wonder if it's time to revisit this?

Here's the one that I was involved in:

esdiscuss.org/topic/rest-parameters

My comment was not meant to be dismissive, I was just hoping to provide some context (and I was on my mobile at the time). If several discussions have tailed off over the years without anyone making a concrete proposal, then it seems unlikely that this discussion will make any more progress than the ones before it. I think the real limiting factor is that this is not a burning issue for anyone, not a particular pain point in day to day programming, so there's no big appetite to try and overcome the main objections.

# guest271314 (5 years ago)

What are the complete expected input and output of match and pad functions?