ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

# Roma Bronstein (5 years ago)

It's my first time suggesting a feature, hope I'm doing it correctly.

I really like using Array.prototype.map(), Array.prototype.reduce() and all related functions. The code is more readable and it looks better. However, when I want to write performance-sensitive code, chaining these functions is not a good approach. For example, writing this:

    // a is an Array of length N
    const b = a.filter().map()

will require 2 traversals over the whole array, up to 2*N iterations (if the filter passes all items).

This is why I often resort to writing this:

    const b = []
    a.forEach((item) => {
      if (/* the filter condition */) b.push(/* mapping logic */)
    })

Which requires only N iterations.
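
To make that concrete, here is a minimal runnable sketch of both forms; the isEven predicate and double mapper are hypothetical placeholders, not part of the original example:

    // Hypothetical predicate and mapper, purely for illustration.
    const isEven = (n) => n % 2 === 0;
    const double = (n) => n * 2;

    const a = [1, 2, 3, 4];

    // Chained form: filter() allocates an intermediate array before map() runs.
    const b1 = a.filter(isEven).map(double);

    // Single-pass form: one traversal, no intermediate array.
    const b2 = [];
    a.forEach((n) => {
      if (isEven(n)) b2.push(double(n));
    });

    // Both produce [4, 8].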

I suggest adding a capability to stream items through these functions. I get my inspiration from Redis's transaction syntax, where you declare the start of a transaction and finally call EXEC in order to execute it. So now I'd be able to write something like this:

    const b = a.stream()
      .filter()
      .map()
      .exec()

Just to clarify the example: I've declared that I'd like to stream the items of array a. Then I've chained the functions I'd like the items to pass through. Finally, I've executed it using the exec() function.

I'm not sure if this is the best syntactical approach, but this example is more intuitive to understand in my opinion.
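
A rough user-land sketch of how such an API could behave (stream() here is a plain function rather than an Array method, and every name in it is hypothetical; this is only one possible reading of the proposed semantics):

    // Hypothetical user-land sketch: operations are recorded lazily and
    // applied in a single pass over the array when exec() is called.
    function stream(array) {
      const ops = [];
      const api = {
        filter(fn) { ops.push({ kind: 'filter', fn }); return api; },
        map(fn) { ops.push({ kind: 'map', fn }); return api; },
        exec() {
          const out = [];
          outer: for (const item of array) {
            let value = item;
            for (const op of ops) {
              if (op.kind === 'filter') {
                if (!op.fn(value)) continue outer;
              } else {
                value = op.fn(value);
              }
            }
            out.push(value);
          }
          return out;
        },
      };
      return api;
    }

    // Usage sketch (isEven and double are hypothetical callbacks):
    // const b = stream(a).filter(isEven).map(double).exec();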

Another approach could be thinking about a "pipeline" operator, like in the UNIX CLI, providing a more generic capability to pipeline iterators.

Again, I hope I'm doing this correctly and in the right forum. And if so, I'd be happy to hear some feedback.

Thanks, Roma

# Oliver Dunk (5 years ago)

This seems like a good place to share the idea, and it’s helpful that you provided use cases etc.

Is there a reason why you prefer the proposed syntax over the forEach loop you mentioned? Personally I like how the forEach is easy to understand, but maybe there are other examples of when the stream is useful and the polyfill is much more complex.

# Roma Bronstein (5 years ago)

Thanks Oliver for the quick response.

The problem for me with forEach is that it's pretty much like a for loop. Anything can go inside the iteration logic. However, with filter/map/reduce/some/every, the intention is explicit. Also, you don't need to implement basic array operations that these functions already provide.

For instance, in my opinion, writing something like:

    a.filter()
     .map()
     .reduce()

Is much clearer and safer than:

    let reducedValue
    a.forEach(item => {
      if (!filterCondition(item)) return
      const mappedItem = mappingLogic(item)
      reducedValue = reduceLogic(reducedValue, mappedItem)
    })
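
To make the comparison runnable, here is one concrete version of both forms; isEven, double and sum are hypothetical stand-ins for the real filter condition, mapping logic and reduce logic:

    // Hypothetical callbacks standing in for the filter condition,
    // mapping logic and reduce logic.
    const isEven = (n) => n % 2 === 0;
    const double = (n) => n * 2;
    const sum = (acc, n) => acc + n;

    const a = [1, 2, 3, 4];

    // Chained form: the intent of each step is explicit.
    const total1 = a.filter(isEven).map(double).reduce(sum, 0);

    // Hand-rolled forEach form: the same logic, but the filtering,
    // mapping and reducing are all buried inside one callback body.
    let total2 = 0;
    a.forEach((item) => {
      if (!isEven(item)) return;
      total2 += double(item);
    });

    // total1 === total2 === 12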

# Bergi (5 years ago)

> However, when I want to write performance-sensitive code, chaining these functions is not a good approach.
>
>     const b = a.filter().map()
>
> will require 2 traversals over the whole array, up to 2*N iterations (if the filter passes all items).

Actually, the number of passes hardly matters. It's still linear complexity. What makes this slow is the allocation of the unnecessary temporary array.

> I suggest adding a capability to stream items through these functions.

We don't need streams; JavaScript already has iterators. What we do need are proper helper functions for those - see the existing proposal at tc39/proposal-iterator-helpers. You can then write

    const b = Array.from(a.values().filter(…).map(…))

or

    for (const x of a.values().filter(…).map(…))
      console.log(x);
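
For illustration, the proposed helpers can be emulated with generator functions; the sketch below (ifilter and imap are hypothetical names, not part of the proposal) shows why such a pipeline visits each element only once and allocates no temporary array:

    // Generator-based emulation of the proposed iterator helpers,
    // just to illustrate the single lazy pass.
    function* ifilter(it, fn) { for (const x of it) if (fn(x)) yield x; }
    function* imap(it, fn) { for (const x of it) yield fn(x); }

    const a = [1, 2, 3, 4];
    const b = Array.from(imap(ifilter(a.values(), n => n % 2 === 0), n => n * 2));
    // Each element of `a` is pulled through the whole pipeline once,
    // and no intermediate array is allocated. b is [4, 8].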

Kind regards, Bergi

# Roma Bronstein (5 years ago)

Thanks for the reply Bergi.

Correct me if I'm wrong, but writing something like:

    const b = Array.from(a.values().filter(…).map(…))

still requires 2 iterations over the array. You're theoretically right that the running-time complexity is still linear. However, with the introduction of async functions within each iteration, in real-life production code there's a significant difference between running 1M iterations vs. ~2M (an iteration can involve interaction with a database or a 3rd-party API).

I'd be happy to hear more from Gus about how his proposal would deal with the scenario I've described.

Thanks, Roma