Tom Boutell (2019-09-07T15:16:53.000Z)
*REVISED PROPOSAL (thanks for the input so far!)*

*Background*

Developers learning async programming in the modern async/await era
frequently discover this useful pattern:

```js
for (const item of items) {
  await db.insert(item);
  // additional awaited operations, etc.
}
```

This pattern extends the readability of "await" to cases where each item in
an array, or each item in an iterable, requires some asynchronous
processing. It also ensures that items are processed one at a time,
avoiding unbounded concurrency that results in excessive stress on back
ends, triggers API rate limiting, and otherwise results in unpredictable
bugs.

However, a common next step is to wish for a manageable level of
concurrency. For instance, processing 500 asynchronous calls at once is
unwise, especially in a web application that is already handling 100
requests at once. But, processing 5 at once may be reasonable and improve
the processing time per request.

Unfortunately, at present, this requires shifting mental gears from
async/await to promises. Here is an example based on Bluebird; there are,
of course, other libraries for this:

```js
const Promise = require('bluebird');
await Promise.map(items, async function(item) {
  await db.insert(item);
  // additional awaited operations, etc.
}, { concurrency: 5 });
```

While effective, this is a lot of boilerplate and a shift of mental model.
And in my experience as a mentor to many developers, this is *the only
situation in which they frequently need to reach for a promise library.*
A language feature that naturally integrates it with async/await would
substantially expand the effectiveness of async/await as a tool for
reducing the cognitive load of asynchronous programming.
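For concreteness, the Bluebird call above can be approximated today in plain async/await, which shows just how much machinery the pattern hides. This is a minimal sketch; the helper name `mapWithConcurrency` is hypothetical, purely for illustration:

```javascript
// A minimal userland sketch of Promise.map-style bounded concurrency.
// mapWithConcurrency is a hypothetical name, not a real library API.
async function mapWithConcurrency(items, fn, concurrency) {
  const results = [];
  let index = 0;
  // Start `concurrency` workers; each synchronously claims the next
  // unprocessed index before awaiting, so no index is handled twice.
  const workers = Array.from({ length: concurrency }, async () => {
    while (index < items.length) {
      const i = index++;
      results[i] = await fn(items[i]);
    }
  });
  await Promise.all(workers);
  return results;
}

// Usage, mirroring the Bluebird example:
// await mapWithConcurrency(items, (item) => db.insert(item), 5);
```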

*Proposed Feature*

I propose extending the existing async/await syntax to accommodate
*specifying the concurrency of a for...of loop*:

```js
for (const item of items concurrency 5) {
  // db.insert is an async function
  await db.insert(item);
}
console.log('Processing Complete');
```

The major benefit here is that *the developer does not have to shift gears
mentally from async/await to thinking about promises to deal with a common
case in systems programming.* They are able to continue with the pattern
they are already comfortable with.

Up to 5 loop bodies may be in flight concurrently with respect to "await"
statements (see below).

There is no guarantee that item 3 will finish before item 2, or that item 4
won't start (due to 3 being finished) before item 2 ends, etc.

If an exception is not caught inside the loop body, the loop stops, that
exception propagates beyond the loop, and any further exceptions from
concurrently executing loop bodies in that invocation of the loop are
discarded.

Just as with an ordinary "for...of" loop containing an "await" statement,
*it is guaranteed that, barring an exception, the loop body will execute
completely for every item in "items" before the loop exits and the
console.log statement executes.* The only difference is that *the
specified amount of concurrency is permitted during the loop*.
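The semantics above can be sketched as a userland desugaring. This is only an approximation of the described behavior under my reading of it, not a definitive specification; `runLoopConcurrently` is a hypothetical name:

```javascript
// Hedged sketch of how the proposed loop might desugar today.
// runLoopConcurrently is a hypothetical, illustrative name.
async function runLoopConcurrently(iterable, body, limit) {
  // "5" may be any numeric expression: cast to an integer, then
  // reject anything that is not a natural number (>= 1).
  limit = Math.trunc(Number(limit));
  if (!(limit >= 1)) {
    throw new RangeError('concurrency must be a positive integer');
  }
  const iterator = iterable[Symbol.iterator]();
  let failed = null; // first uncaught exception wins; later ones are discarded
  async function worker() {
    for (;;) {
      if (failed) return;           // an earlier body threw: stop pulling items
      const { value, done } = iterator.next(); // synchronous claim, no races
      if (done) return;
      try {
        await body(value);
      } catch (err) {
        if (!failed) failed = { err };
      }
    }
  }
  // Up to `limit` bodies in flight; resolves only when all have settled.
  await Promise.all(Array.from({ length: limit }, worker));
  if (failed) throw failed.err;
}
```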

"5" may be any expression resulting in a number. It is cast to an integer.
If the result is not a natural number (i.e., it is 0 or negative), an
error is thrown.

*FAQs*

*"What if I want unlimited concurrency?"*

It is rarely a good idea. It results in excessive stress on back ends,
unnecessary guards that force serialization in interface libraries just to
cope with (and effectively negate) it, and API rate limiting. This feature
teaches the best practice that the level of concurrency should be mindfully
chosen. However, those who really want it can specify "concurrency
items.length" or similar.

*"What about async iterators?"*

The feature should also be supported here:

```js
for await (const item of items concurrency 5) {
  // db.insert is an async function
  await db.insert(item);
}
```

While the async iterator itself is still sequential rather than concurrent,
frequently these can supply values considerably faster than they are
processed by the loop body, and so there is still potential benefit in
having several items "in the hopper" (up to the concurrency limit) at a
time.
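The intended behavior can be sketched in userland as well, assuming the underlying async generator queues concurrent `next()` calls (async generators do this per spec; arbitrary async iterables may not). `consumeAsync` is a hypothetical helper name:

```javascript
// Sketch of bounded-concurrency consumption of an async iterator.
// consumeAsync is a hypothetical, illustrative name.
async function consumeAsync(asyncIterable, body, limit) {
  const iterator = asyncIterable[Symbol.asyncIterator]();
  async function worker() {
    for (;;) {
      // The iterator itself is still consumed sequentially (async
      // generators queue overlapping next() calls), but up to `limit`
      // loop bodies can be awaiting at the same time.
      const { value, done } = await iterator.next();
      if (done) return;
      await body(value);
    }
  }
  await Promise.all(Array.from({ length: limit }, worker));
}
```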

-- 
Chief Software Architect
Apostrophe Technologies
Pronouns: he / him / his