Array.prototype.replace

# Ben Wiley (6 years ago)

folks,

I'm brand new to this process and I've drafted a proposal (and polyfill) for a new Array.prototype.replace method. In the vein of Array concat, String replace and the new Object rest spread, Array replace provides a way to shallow clone an array and replace an element in a single statement.

Here's the GitHub repo, which outlines API, motivation, use cases, alternatives, and polyfill usage: benwiley4000/array-replace

Array replace shouldn't be considered an across-the-board substitute for Array.prototype.splice (which mutates the array) and wouldn't be ideal for performance-critical applications handling very large arrays. It is well suited for application logic where immutability is desired - a growing trend in the JavaScript community.

Most of the application areas of object rest spread composition syntax (for building objects, not destructuring them) are also application areas for Array.prototype.replace, when your object happens to be an array.
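
For illustration, here's a minimal sketch of how such a method could be polyfilled today (I'm assuming a replace(index, value) signature that returns a new array; the repo above has the actual API and polyfill):

if (!Array.prototype.replace) {
  Object.defineProperty(Array.prototype, 'replace', {
    value: function replace(index, value) {
      // shallow clone, then overwrite a single slot
      const copy = this.slice();
      copy[index] = value;
      return copy;
    },
    writable: true,
    configurable: true
  });
}

const original = [1, 2, 3];
console.log(original.replace(1, 4)); // [1, 4, 3]
console.log(original); // [1, 2, 3] - the source array is untouched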

I'm looking forward to any feedback and I'd also love for a champion to step in and help, if you think this is an ok idea.

Thanks! Ben

# Isiah Meadows (6 years ago)

I like the idea, but there are a few tweaks I think could be made:

  1. replace reads like you're doing a string or subarray replace. This proposal sounds more like an update.
  2. If you allow it to carry the semantics of slice() + array[index] = value, engines can implement it a bit quicker (see the sketch after this list).
  3. IMHO, this belongs as syntax, either in addition to or in place of this proposal's method. If nothing else, it's for consistency with object spread, but it also allows you to spread iterables similarly, something like [...iter, index: value]. We could also introduce that as an array pattern type, so we don't need to use elisions so frequently to skip values in array destructuring patterns (think: regexp.exec(string) results, when you don't care about all the groups). There have been times where I've had upwards of 3-4 elisions all clustered together, and that gets unreadable in a hurry.
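
For point 2, here's roughly what "slice() + array[index] = value" semantics look like spelled out (just a sketch of the observable behavior, not the proposal's exact wording):

const arr = [1, 2, 3];
const index = 1;
const value = 4;

// observably the same as the proposed arr.replace(index, value):
const out = arr.slice(); // one shallow clone an engine can optimize
out[index] = value;      // one indexed store, no per-element callback
console.log(out); // [1, 4, 3]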

Isiah Meadows me at isiahmeadows.com, www.isiahmeadows.com

# Ben Wiley (6 years ago)

Oops, meant to send to the rest of the list.

Thanks for the feedback Isiah.

  1. Fair!
  2. Could you elaborate? I believe that's what this is, though I might be missing your point.
  3. a. As I noted on GitHub I also considered that syntax originally. I'm torn on it because allowing out-of-order index definitions in an array literal seems a bit wacky, but maybe it's not so bad? b. Hmm, this sounds interesting but I'm not sure I totally follow. Could you give a more convenient concrete example of the array pattern type for the example you're discussing?

Ben

# Ben Wiley (6 years ago)

P.S. sorry for the double post, but I should mention the word "convenient" came from nowhere (except autocorrect) in my last message. I wasn't calling your example "inconvenient." :)

# T.J. Crowder (6 years ago)

The standard library already handles doing array-copy-and-update as a one-liner via Object.assign (jsfiddle.net/ryqtvbdk):

const original = [1, 2, 3, 4];
const updated = Object.assign([...original], {1: 4, 3: 42});
// Or: const updated = Object.assign([], original, {1: 4, 3: 42});
console.log(updated); // [1, 4, 3, 42]

Like Isiah, I think I'd prefer it as syntax. I'm not an engine implementer so I have no idea how hard it would be to do this to an array initializer:

const original = [1, 2, 3];
const updated = [...original, 1: 4];
console.log(updated); // [1, 4, 3]

...but that's what I'd like to see. Parallels the object initializer. Currently invalid syntax, so safe to add from that perspective. And it enhances destructuring as well (since array initializer syntax is used for destructuring):

const original = [1, 2, 3];
const [1: foo, ...rest] = original;
console.log(foo); // 2
console.log(rest); // [1, 3]

(Note that rest is an array, whereas with an object destructuring pattern, it would be a non-array object.)

That syntax would also provide expressive creation of sparse arrays, e.g.:

const array = [2: 42];
console.log(array); // [, , 42];

-- T.J. Crowder

# Michał Wadas (6 years ago)

Personally I would prefer the signature Array.prototype.replace(map : Map).

gist.github.com/Ginden/bf628e88be0886ac8340bdecf16e98f3
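
To illustrate, a guess at what that Map-keyed signature could behave like (the gist has the actual sketch; the helper name here is hypothetical):

// hypothetical userland version of replace(map : Map)
function replaceWithMap(array, map) {
  const copy = array.slice();
  for (const [index, value] of map) {
    copy[index] = value;
  }
  return copy;
}

console.log(replaceWithMap([1, 2, 3, 4], new Map([[1, 4], [3, 42]])));
// [1, 4, 3, 42]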

# Ben Wiley (6 years ago)

T.J., thanks for pointing out that Object.assign provides the same functionality. I'd neglected to consider that the first argument needn't be a plain object.

Interesting that folks seem to prefer the syntax variant more than I had expected.

As for the application of providing a destructuring parallel... well, you've convinced me it could be useful. :)

Here's a spec question: must the keys specified be numbers? The application is questionable, but I'd say anything could be allowed. E.g.

const arr1 = [1,2,3]
const arr2 = [...arr1, foo: 'bar' ]

Or

const [ 1: middle, foo, ...arr3 ] = arr2
console.log(middle, foo, arr3) // 2 "bar" [1, 3]

So array rest spread would provide totally parallel functionality to object rest spread, with the key difference that the result objects are arrays instead of plain objects.
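
For what it's worth, the non-numeric-key variant can already be approximated with Object.assign, since the extra key just becomes an own property of the array copy:

const arr1 = [1, 2, 3];
const arr2 = Object.assign([...arr1], { foo: 'bar' });
console.log(arr2.foo); // "bar"
console.log(arr2.length); // 3 - the named key doesn't affect length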

Thoughts?

Ben

# T.J. Crowder (6 years ago)

On Tue, Jul 10, 2018 at 1:22 PM, Ben Wiley <therealbenwiley at gmail.com> wrote:

Here's a spec question: must the keys specified be numbers? The application is questionable but I say anything could be allowed. E.g. ... So array rest spread would provide totally parallel functionality to object rest spread with the key difference that result objects are arrays instead of objects.

I'd call it a minor point. But off-the-cuff:

There's always a hopefully-creative tension between A) not unnecessarily limiting things, and B) YAGNI and/or not handing people footguns.

In the "don't unnecessarily limit" column:

  • The ship has already sailed in terms of people confusing arrays and objects in JavaScript.
  • Not limiting to array index property names should mean the same parsing structures and code can be used.
  • I don't like unnecessary runtime checks, and the check that the property name is an array index would have to be at runtime, not parse-time, because of computed property names.
  • Standard array indexes are officially strings anyway, though we write them as numbers and they get optimized that way most of the time (quick demo after this list).
  • [length: 10] has a certain seductive quality about it.
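
A quick demonstration of the "indexes are strings" point:

const arr = ['a', 'b', 'c'];
console.log(Object.keys(arr)); // ["0", "1", "2"]
console.log(arr[1] === arr["1"]); // true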

In the YAGNI and/or footgun column:

  • People already get objects and arrays confused enough! At least right now, if they write [foo: "bar"], they get a syntax error (though of course {0: "bar"} is perfectly valid). Don't hand them yet another footgun.
  • As you say, application is questionable.

-- T.J. Crowder

# Ben Wiley (6 years ago)

# Ben Wiley (6 years ago)

Hm, despite the smaller number of points in the cons category, I'm persuaded by the argument that we don't want people getting arrays and objects confused. Might be best to limit it until there is a compelling use case, which there might not be.

Ben

# T.J. Crowder (6 years ago)

On Tue, Jul 10, 2018 at 2:18 PM, Ben Wiley <therealbenwiley at gmail.com> wrote:

Hm, despite the smaller number of points in the cons category, I'm persuaded by the argument that we don't want people getting arrays and objects confused. Might be best to limit it until there is a compelling use case, which there might not be.

Heh, whereas despite having written that first bullet in the footgun column somewhat forcefully (looking back), I go the other way. :-)

-- T.J. Crowder

# Andrea Giammarchi (6 years ago)

Just a few days ago another full-stack JS dev mentioned Array replace, and it has nothing to do with what was proposed here: medium.com/@gajus/the-case-for-array-replace-cd9330707243

My TL;DR response was that once the pipe operator is in, everyone can bring in their own meaning for array |> replace and call it a day.
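
For example, the pipeline-friendly version would just be a standalone helper (a sketch; the |> line is a comment because the operator is still only a proposal):

// userland replace, no prototype involved
const replace = (index, value) => array => {
  const copy = array.slice();
  copy[index] = value;
  return copy;
};

console.log(replace(1, 4)([1, 2, 3])); // [1, 4, 3]
// with the pipeline proposal this would read roughly: [1, 2, 3] |> replace(1, 4)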

Continuing to pollute the already most polluted prototype of them all doesn't look like a good strategy for improving the language.

Just my 2 cents.

# Ben Wiley (6 years ago)

It’s not clear to me that pursuit of new Array methods should be abandoned purely on speculation that the pipe operator will pass Stage 1.

That said, the realization that Object.assign provides this functionality is enough for me to quit pursuing (my version of) Array.prototype.replace.

I’d prefer that further discussion concern the earlier-discussed extension to the Array rest spread syntax. :)

Ben

# Isiah Meadows (6 years ago)

I haven't thought of that before, but it's all the more reason to prefer syntax over a new builtin.


Isiah Meadows me at isiahmeadows.com, www.isiahmeadows.com

# Isiah Meadows (6 years ago)

The main things I know of that are blocked on the pipeline operator IIUC are observables and iterable utilities. As-is, using observables without methods or a pipeline operator starts to feel like you're using Lisp, not JS, because of the sheer number of operators. (It's an array over time, not space, so you have things like debouncing, throttling, etc. that you have to address.) Iterables are in a similar situation because they're lazy, it's protocol-based rather than prototype-based, and JS lacks anything like monads.


Isiah Meadows me at isiahmeadows.com, www.isiahmeadows.com

# kai zhu (6 years ago)

The main things I know of that are blocked on the pipeline operator IIUC are observables and iterable utilities.

unlike synchronous languages like python where everything blocks, do we really want iterable utilities for a [one-of-a-kind] async-first language like javascript?

nothing good has ever come from mixing generators with async i/o from my experience. it typically results in hard-to-reason logical-nightmares (and hard-to-fix timeout bugs), that makes web-project integration-work more hellish than it already is (think of how un-debuggable most projects that use koajs-middlewares end up).

# Isiah Meadows (6 years ago)

The iterable utilities would be geared towards sync iterators primarily, which are typically not used for I/O. They would have more in common with Lodash than RxJS.

And JS is not async-first - that's Go, Haskell (with GHC), Clojure (somewhat), and some mostly obscure and/or niche languages, which feature non-blocking I/O with deep language integration (you don't have to code explicit support for it) complete with easy parallelism and syntactic sugar for other things concurrency-related. JS is async-second, like C# and F#, which each feature native non-blocking and (in some cases) optional blocking I/O along with native syntax and/or DSLs with appropriate builtins to support the non-blocking variants, but you still have to code specially for them.

Observables are typically used for input, not output, but they do actually shine well for what they do. They aren't IMHO the ideal abstraction, but they're pretty close for relatively procedural languages. (I prefer duplex streams as a primitive, not observables - they're usually more fault-tolerant, and they're easier to compose and integrate with.)

One last thing: Koa now uses promises, not generators. They only used generators to emulate what async functions provided, and they first made the decision before the feature went stable in V8, even before the feature hit stage 4 in the spec.

# kai zhu (6 years ago)

The iterable utilities would be geared towards sync iterators primarily, which are typically not used for I/O.

can you give common use-cases for javascript sync iterators in an industry-context (e.g. web-programming) that are superior to existing es5 design-patterns? because i honestly cannot think of any. as a former pythonista (for 7 years), i recall using python-iterators primarily for blocking-io (e.g. readline) or algorithms, both of which are niche-applications of javascript in industry (nobody hires javascript-programmers to waste days writing algorithms, when good-enough results can be achieved in hours with sqlite3 or child_process calls to imagemagick/numpy/grep/find/"rm -r"/etc).

the typical scenario for sync iterators that plays out in my head, is that the pm will eventually request a feature requiring unavoidable async io, turning the sync iterator into an async one, which quickly devolves into technical-debt during integration (and will eventually have to be rewritten as a non-iterator for anyone who has the will to do the cleanup).

# Ben Wiley (6 years ago)

I think this discussion has drifted far from the latest on-topic question, which concerned a syntax proposal for array spreading with replacement.

Did anyone have thoughts on that?

To recap, the array rest spread assignment proposal:

const arr1 = [1, 2, 3]
const arr2 = [...arr1, 1: 4] // [1, 4, 3]

Additional proposal for destructuring:

const [...arr3, 1: b] = arr2
console.log(arr3) // [1, 3]
console.log(b) // 4

Potential problems with the destructuring proposal:

const arr4 = [1, 2, 3, 4, 5]
const [a, ...arr5, 2: b] = arr4
console.log(a, arr5, b)
// is this... 1, [2, 4, 5], 3 ? or... 1, [2, 3, 5], 4 ?

const [c, ...arr6, 0: d] = arr4
console.log(c, arr6, d)
// is this... 1, [2, 3, 4, 5], 1 ? or... 1, [3, 4, 5], 2 ?

The destructuring bit seems kind of cool but potentially very hard to read, even if those ambiguities are resolved.
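
For comparison, the closest you can get today without the new pattern (a sketch using existing syntax, so the indexed binding becomes a separate statement):

const arr4 = [1, 2, 3, 4, 5];
const [a, ...arr5] = arr4;
const b = arr4[2];
console.log(a, arr5, b); // 1 [2, 3, 4, 5] 3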

I'd be fine pushing ahead with only the first proposal (const arr2 = [...arr1, 1: 4]).

Ben

# Isiah Meadows (6 years ago)

I'm the one who suggested syntax over a method. If we go for support for destructuring, too, I feel this is probably the ideal behavior:

  • [3: c, a, b, ...rest] = foo should be illegal: positional properties must come before indexed properties.
  • [a, b, ...rest, 3: c] = foo should result in c being the fourth result of the iterable (offset 3), but ...rest also contains it as its second entry.

But I'm not sure destructuring support is even a good idea to add: {3: c} = foo already works, and I've yet to come across a scenario where the positional access is even useful apart from off the end (which is solved by allowing entries after the spread).
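
For reference, the object-pattern form over an array that already works today:

const foo = ['a', 'b', 'c', 'd', 'e'];
const { 3: c, length } = foo;
console.log(c, length); // "d" 5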


Isiah Meadows me at isiahmeadows.com, www.isiahmeadows.com