Proposal: `await.all {...}` for parallelism
If we have await.all, what about await.race, await.allSettled, await.any?
Typically I find I want to loop over an iterator of items and apply a body of work to each of them in parallel.
So it would be nice to support full blocks of statements that can do this work in parallel, instead of relying on just expressions or functions to achieve this.
Extending for loops to have a parallel form is another option (I seem to recall something similar brought up before here):
for await.all (const entry of entries) {
await doWork(entry);
}
Whether async iteration should be supported or treated as sync or parallel is another question though, and possibly a confusion of the form.
Regarding the other Promise methods, this syntax could certainly extend to all of them (that's part of the reason I chose this syntax, to leave that option open).
await.any and await.race would work analogously to await.all, but since we're no longer dealing with return values there's no good way to get the value that won the race. This is fine as long as you're not dependent on that value and only care about executing some code as soon as the race is won. The best way I can think of to get the winning value is something like this:
async function myFunc() {
let value1, value2;
await.any {
value1 = await foo;
value2 = await bar;
}
const firstResult = value1 || value2; // could also use ??
}
...which isn't any easier than using Promise.any.
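For comparison, the direct Promise.any version of that sketch (assuming foo and bar are promises, as in the snippet above) is just:
async function myFunc() {
  // Promise.any resolves with the first fulfilled value,
  // so there's no value1/value2 bookkeeping at all.
  const firstResult = await Promise.any([foo, bar]);
}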
await.allSettled would, in many cases, be the same as Promise.all except it would also swallow errors. I'd have to think more about its use cases but it could be implemented the same way.
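To make "implemented the same way" concrete, here is a hedged sketch of how an await.allSettled block might desugar, assuming the same statement-per-task desugaring as await.all (the request calls are reused from the proposal's example; none of this syntax exists today):
// Hypothetical source:
//   await.allSettled {
//     foo = (await request('foo.json')).data;
//     bar = (await request('bar.json')).data;
//   }
// Possible desugaring: each child statement becomes an async arrow, and
// Promise.allSettled never rejects, so errors in either branch are swallowed.
await Promise.allSettled([
  (async () => { foo = (await request('foo.json')).data; })(),
  (async () => { bar = (await request('bar.json')).data; })(),
]);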
Regarding parallel for-loops: I'd consider that a separate problem. You'd want parallel for-loops for things like requests to the same API, where each promise is handled the same way. await.all is more for handling disparate tasks in parallel without having to resort to thinking in terms of arrays and iteration.
There have been for await loops since recently; there could be await for loops for wrapping the whole execution of a loop:
console.time(1);
await for (const fn of [()=>delay(50).then(()=>'a'),
()=>delay(80).then(()=>'b')]) {
console.timeLog(1, await fn());
}
console.timeEnd(1);
This would log the same as:
console.time(1);
await Promise.all(
[()=>delay(50).then(()=>'a'), ()=>delay(80).then(()=>'b')]
.map(async fn => { console.timeLog(1, await fn()) })
);
console.timeEnd(1);
/*
1: 51.066162109375ms a
1: 80.998291015625ms b
1: 81.315185546875ms
*/
Without await for, things are serial:
console.time(1);
for (const fn of [()=>delay(50).then(()=>'a'),
()=>delay(80).then(()=>'b')]) {
console.timeLog(1, await fn());
}
console.timeEnd(1);
/*
1: 50.68212890625ms a
1: 130.9951171875ms b
1: 131.1162109375ms
*/
(var delay = t => new Promise(r => setTimeout(r, t));)
I don't like the idea of await behaving differently inside vs outside of the "await.all" block, and I think it is a source of bugs:
await.all {
  const x = await doSomethingAsync();
  //x is still undefined here! Not the case outside of an await.all block
}
Maybe if you drop the "await" in your example:
await.all {
  const x = doSomethingAsync();
  //x is just the promise here, but at least it's the same whether inside or outside of the await.all block
}
...but that still waits for the async functions to complete. I think it would cause fewer bugs and would seem to still satisfy the motivation?
Hello!
This [current] structure is also just fundamentally different from working serially in async/await and it forces you to reason about the problem in a specific way. This doesn't appear to be a conscious decision to force good code practices
Actually I'd argue that it is. Doing stuff concurrently is fundamentally different from doing it serially, and should be reasoned about every time you use it.
kind regards, Bergi
Maybe if you drop the "await" in your example:
await.all { const x = doSomethingAsync(); //x is just the promise here }
...but that still waits for the async functions to complete. I think it would cause fewer bugs and would seem to still satisfy the motivation?
It doesn't seem like the await.all block is doing anything in that case. That code seems equivalent to this:
const x = doSomethingAsync();
myFunction(await x)
await.all {
const x = await doSomethingAsync();
//x is still undefined here!
}
You bring up a good point about scoping and race conditions. It's a little tricky since the curly braces create a block scope but none of the parallel statements should be allowed to access each other's variables; it's almost like each statement should have its own scope. Maybe it'd be better to have a syntax that ensures a set of curly braces for each parallel task? Async do-expressions could be a good solution (assuming they'd work kind of like an async IIFE):
async function initialize() {
let foo, bar, baz;
await Promise.all([
async do { foo = (await request('foo.json')).data },
async do { bar = (await request('bar.json')).data },
async do { baz = (await request('baz.json')).data },
]);
render(foo, bar, baz);
}
(this is also a less drastic syntax change that piggybacks on an existing proposal)
...strike that, I misread the "but that still waits for the async functions to complete" part. So what you're proposing is that everything functions normally inside the curly braces, but execution doesn't continue until all promises have resolved? So your example would work essentially like this:
const x = doSomethingAsync();
const y = doSomethingElseAsync();
await x, await y;
// all promises are resolved by now, but
// still need to use await to unbox the values
someFunction(await x, await y);
Just FYI, I previously suggested a couple things substantially more flexible than this 1, 2 (originated from this 3), and it mostly fell flat due to being highly premature. Anything exclusive to promises is unlikely to win as library methods exist for basically all use cases and from my experience, committee members are in general very hesitant to add syntax for anything that doesn't pay for itself well. Similar questions have come up a few times in the past, too, and I've commented on two of them. 4, 5
If anything, I don't feel we know the problem space well enough, and the language lacks the primitives needed to really dig into it. (This is why I came up with my generator forking strawman. 6)
Isiah Meadows contact at isiahmeadows.com, www.isiahmeadows.com
Why not just await as it already is, but supporting an iterable / array of promises, as Promise.all already does, automatically discerning single promises vs multiple ones:
const promises = [...]
// in parallel (`await` automatically acts as `Promise.all` here)
const results = await promises
results.forEach(result => ...)
Just FYI, I previously suggested a couple things substantially more flexible than this
Ah, thank you for bringing those proposals to my attention. I looked through the archives for relevant discussions but I must've missed them.
It seems like we converged on a similar syntax for what you called "merging," and the idea that there ought to be a separate syntax for iteration. I don't know whether that means that this is the right solution or just the most obvious one, but either way it's encouraging to know that other people have the same difficulties with the current syntax and are thinking about the problem.
from my experience, committee members are in general very hesitant to add syntax for anything that doesn't pay for itself well
Yeah, I figured the bar would be high for new syntax. I've run into the awkwardness of dealing with distinct parallel tasks several times, and a few of the people I discussed it with were in the same boat, so I wrote up this proposal thinking it might have a wide appeal. The proposed syntax desugars via a relatively simple transformation but encourages developers to reason about the problem in a completely different way that I'd argue is more intuitive. Whether the committee agrees and thinks it justifies a new syntax remains to be seen, but either way I'm excited to see where this discussion goes (whether it leads to the proposed syntax, to some other syntax, or to somewhere else entirely).
As a side note: thank you to everyone for the thoughtful questions and responses, I had no idea what to expect from this thread and it's gone better than I could've hoped for. Thank you for not immediately shooting down a proposal that looks similar to other proposals before it.
Why not just await as it already is, but supporting an iterable / array of promises, as Promise.all already does
await can already accept a non-promise, so I believe that'd be a breaking change if Array.prototype.then is set. It also requires collecting the promises in an array, which is what the proposed syntax is trying to avoid.
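To illustrate the compatibility concern, a contrived sketch (nobody should actually patch Array.prototype like this; it only shows the hazard):
// Under today's semantics, await unwraps any thenable, including an array
// that picks up a then method from its prototype chain.
Array.prototype.then = function (resolve) { resolve('hijacked'); };

async function demo() {
  const promises = [Promise.resolve(1), Promise.resolve(2)];
  console.log(await promises); // logs 'hijacked' today, not [1, 2]
}
// Making plain `await` behave like Promise.all would change that observable behavior.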
AFAIK await can only accept an expression to be treated as a Promise, nothing else:
developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/await
I have a solution for that:
const promises = [...]
await.all promises //returns an array of results
await.race promises //returns a single result
etc.
Why not just use a combination of async/await and .then?
async function initialize() {
const [foo, bar, baz] = await Promise.all([
request('foo.json').then(t => t.data),
request('bar.json').then(t => t.data),
request('baz.json').then(t => t.data),
]);
render(foo, bar, baz);
}
const promises = [...]
await.all promises //returns an array of results
await.race promises //returns a single result
One of the goals of this proposal is to simplify the process of collecting the promises in an array and then having to get them out of another array.
Why not just use a combination of async/await and .then?
That's an okay solution, though we still end up having to maintain "parallel lists" of promises and values, which I'm hoping to avoid. But you can sidestep them with something like this, which isn't too bad:
async function initialize() {
let foo, bar, baz;
await Promise.all([
request('foo.json').then(t => foo = t.data),
request('bar.json').then(t => bar = t.data),
request('baz.json').then(t => baz = t.data),
]);
render(foo, bar, baz);
}
I have a solution for that:
const promises = [...]
await.all promises //returns an array of results
await.race promises //returns a single result
Well, my proposal is exactly that, but doing await.all by default with just await.
Yes of course, I was responding to your proposal and the subsequent email about it being incompatible with existing JavaScript because "await" on its own accepts non-promises, so wouldn't return an array of results from an array of promises, hence why I proposed await.all etc.
This [current] structure is also just fundamentally different from working serially in async/await and it forces you to reason about the problem in a specific way. This doesn't appear to be a conscious decision to force good code practices
Actually I'd argue that it is. Doing stuff concurrently is fundamentally different from doing it serially, and should be reasoned about every time you use it.
I agree that parallelism is different and should be handled with care, but I don't think it follows that the best way to reason about parallelism is the way that Promise.all encourages. Making something more complicated doesn't necessarily mean you'll do a better job of reasoning about it.
If you think the proposed syntax encourages poorly-reasoned-about code, I'm open to iterating on it to find a syntax that works with the developer to handle parallelism in a safe way, and also doesn't require them to write too much boilerplate code.
I do find the pattern of promise "all" combined with destructuring the easiest way to handle parallelism. I think it's the only "deterministic" parallel pattern, code-wise.
I think non-determinism in code increases the probability of bugs.
I am very sympathetic to pitches to allow more common cases for promise libraries to be written in an "awaitful" syntax without thinking explicitly about promises.
However, I think that changing the meaning of the semicolon in a particular context has too much potential for confusion. As others have said, parallel execution is different, and it should look and feel different. The most basic assumption a developer makes (consecutive lines of code run consecutively) is difficult to get away from; that's why we introduced "await" in the first place, to bring back the ability to write deterministic code with consecutive statements. Which sounds like a reasonable ask, when it's put that way. (:
I did propose this recently:
for (const item of items concurrency 5) {
  await doTheThing(item);
}
However in this case I'm not talking about consecutive statements, I'm only talking about rules for simultaneously (in the sense of async, not threads) running more than one instance of the block. So I'm not proposing that we change the meaning of the semicolon(s) within the block in a way that could mean that if you're looking at half the code in the middle you would be likely to fundamentally misunderstand its operation.
I think that risk - that you can't tell what a semicolon means without reference to the outer context - is what makes your proposal a bridge too far for me.
However, if await.all { ... } were to mean "wait for all non-awaited async function calls made within this block to complete before proceeding", as I suggested earlier, I think that could satisfy determinism for "await" wherever it is used, and satisfy the original motivation:
await.all {
for (const item of items) {
doTheThingAsync(item);
}
}
Notice I have omitted await inside the loop. As in current JavaScript, that causes parallel execution, so no change on that front from a determinism perspective. So determinism is not hurt by await.all. Rather, it guarantees completion before going further.
In an earlier example (paraphrase-coded as I forgot the names):
let x, y;
await.all {
x = getXAsync();
y = getYAsync();
}
processXAndY(x, y);
I think the benefit of this syntax appears more stark with the looped (first) example, as current JavaScript requires building an array in the loop to subsequently pass to Promise.all (sketched below), which I think is a little more difficult to conceptualize than the await.all { ... } way of doing it.
The 2nd example is arguably better than current JavaScript too, particularly because the coder doesn't have to be very smart with destructuring in light of understanding the "Promise.all" return type, etc. In other words, less cognitive overhead, which I think is a net positive.
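For reference, the current-JavaScript shape being compared against looks roughly like this (a sketch reusing the items/doTheThingAsync names from the loop example above):
// Today: collect the promises in an array inside the loop,
// then pass that array to Promise.all afterwards.
const pending = [];
for (const item of items) {
  pending.push(doTheThingAsync(item));
}
await Promise.all(pending);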
This is very interesting, but this code:
await.all {
  x = getXAsync();
  y = getYAsync();
}
processXAndY(x, y);
Still carries within it the problem that if I'm looking at just the middle of the { ... } block — if "await.all" has scrolled offscreen — I'll be completely wrong about what ";" means. I think that's too much magic.
Also, in the case of the "for" loop, this doesn't address managing the level of concurrency. Although it could in theory with a syntax like await.all({ concurrency: 5 }), I'm not sure if it's practical to implement that for your general case.
Actually I'm curious about what the implementation would look like in general. If it were babel compiling this, I guess it would have to wrap every statement not preceded by "await" with a check for whether it returns a thenable and add it to an array if it does. But with the concurrency feature it would also have to defer executing the code at all until the right time as otherwise we're still starting zillions of "processes" at once.
It does not change the meaning of the ";" at all. As you may already know, omitting await already invokes multiple async function calls in parallel in current JavaScript, so absolutely no change in that respect. The only thing this await.all suggestion does is ensure that all non-awaited async function calls are completed before proceeding beyond the end of the block.
i.e. it adds fairly straightforward and terse deterministic control to otherwise non-deterministic code, without requiring knowledge of destructuring or Promise.all.
Hey, you're absolutely right! It's OK because it just means things are more deterministic before the block exits. It doesn't impact any reasonable expectations during the block.
I am convinced that your syntax is useful and does not introduce any new confusion.
I wonder, then, if it is also possible to implement concurrency limits for this properly?
await.all({ concurrency: 5 }) {
  for (const item of items) {
    // returns promise
    item.process();
  }
}
This is more challenging because, in our transpilation, we can't just bottle up all the promises and call Promise.all at the end. It would be too late to manage how many are in process at once, bashing on various API limits (:
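To make the difficulty concrete, here is a hand-rolled sketch of the kind of concurrency limiter a transpiler (or user code) would effectively need instead of a single Promise.all at the end; runWithConcurrency is a hypothetical helper, not part of any proposal:
// Run task functions (each returning a promise) with at most `limit` in flight.
async function runWithConcurrency(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;               // claim the next task index
      results[i] = await tasks[i]();  // start it only when a slot frees up
    }
  }
  // Start `limit` workers; each pulls a new task as its previous one finishes.
  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, worker));
  return results;
}
// Usage sketch with the earlier example's names:
// await runWithConcurrency(items.map(item => () => item.process()), 5);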
There is however a performance concern with your code that we should talk about.
If I write this:
await.all {
  returnsAPromise();
  for (let i = 0; (i < 100000); i++) {
    doesSimpleFastSynchronousMath();
  }
  returnsAnotherPromise();
}
Then Babel will have no choice but to compile this to:
const promises = [];
{
  const maybeThenable = returnsAPromise();
  if (maybeThenable && maybeThenable.then) {
    promises.push(maybeThenable);
  }
}
for (let i = 0; (i < 100000); i++) {
  const maybeThenable = doesSimpleFastSynchronousMath();
  if (maybeThenable && maybeThenable.then) {
    promises.push(maybeThenable);
  }
}
{
  const maybeThenable = returnsAnotherPromise();
  if (maybeThenable && maybeThenable.then) {
    promises.push(maybeThenable);
  }
}
await Promise.all(promises);
Which could have a significant performance impact on that synchronous inner loop.
Why not make it work with the addition of a new keyword suffix for parallel awaits (for example, ||), grouping the mid blocks that require parallelism? Playing with your example and going a bit further:
async {
const v0 = await|| returnsAPromise(); // to be grouped in parallel
for (let i = 0; (i < 100000); i++) {
doesSimpleFastSynchronousMath();
}
const v1 = await|| returnsAnotherPromise(); // to be grouped in parallel
async {
await returnsAnotherPromise1();
const v2 = await|| returnsAnotherPromise2(); // to be grouped in parallel
const v3 = await|| returnsAnotherPromise3(); // to be grouped in parallel
await returnsAnotherPromise4(v2, v3);
const v4 = await returnsAnotherPromise5();
}
await returnsAnotherPromiseX(v0, v1);
}
another example, that may "normalize" it a bit more:
async { // ... = returns a promise
const x1 = await|| ...
const x2 = await ... (x1)
const x3 = await|| ...
const x10 = await ... (x2, x3)
let x4, x5, x6
async {
x4 = await|| ... (x1, x2)
x5 = await|| ... (x2, x3)
x6 = await ... (x4, x5, x10)
}
let x7, x8, x9
async {
x7 = await|| ... (x4, x6)
x8 = await ... (x6, x7)
x9 = await|| ... (x5, x6)
}
await ... (x8, x9)
}
Would you mind clarifying the performance hit you are referring to? I'm seeing that the "synchronous" calls wouldn't be added to the array you used in your example, so it's not clear to me to which performance hit you are referring.
Hi Manuel! Would you mind explaining the added value of your proposal? I am only seeing it being more verbose, but I've not found any added functionality or benefit from it
the proposal combines parallel and series async / await and could resolve complex trees, like the following example:
// NOTE async {} = asynchronous scope (it groups parallel awaits => `await||`)
// NOTE p? = function call that returns a promise (? = just an index)
async {
const x1 = await|| p1()
const x2 = await p2(x1)
const x3 = await|| p3()
const x10 = await p10(x2, x3)
let x4, x5, x6
async {
x4 = await|| p4(x1, x2)
x5 = await|| p5(x2, x3)
x6 = await p6(x4, x5, x10)
}
let x7, x8, x9
async {
x7 = await|| p7(x4, x6)
x8 = await p8(x6, x7)
x9 = await|| p9(x5, x6)
}
await p11(x8, x9)
}
// it would resolve a tree of parallel and series like the following with traditional promises
Promise.resolve()
.then(() => Promise.all([p1, p3]))
.then((x1, x3) =>
p2(x1)
.then(x2 =>
p10(x2, x3)
.then(x10 => {
let x4, x5, x6
return Promise.resolve()
.then(() => Promise.all([p4(x1, x2), p5(x2, x3)]))
.then(results => [x4, x5] = results)
.then(() => p6(x4, x5, x10))
.then(x6 => {
let x7, x8, x9
return Promise.resolve()
.then(() => Promise.all([p7(x4, x6), p9(x5, x6)]))
.then(results => [x7, x9] = results)
.then(() => p8(x6, x7))
.then(_x8 => x8 = _x8)
.then(() => p11(x8, x9))
})
})
)
)
a few little corrections...
// NOTE async {} = asynchronous scope (it groups parallel awaits => await||)
// NOTE p? = function call that returns a promise (? = just an index)
async {
const x1 = await || p1()
const x2 = await p2(x1)
const x3 = await || p3()
const x10 = await p10(x2, x3)
let x4, x5, x6
async {
x4 = await || p4(x1, x2)
x5 = await || p5(x2, x3)
x6 = await p6(x4, x5, x10)
}
let x7, x8, x9
async {
x7 = await || p7(x4, x6)
x8 = await p8(x6, x7)
x9 = await || p9(x5, x6)
}
await p11(x8, x9)
}
Promise.resolve()
  .then(() => Promise.all([p1(), p3()]))
  .then((x1, x3) =>
    p2(x1)
      .then(x2 =>
        p10(x2, x3)
          .then(x10 => {
            let x4, x5, x6
            return Promise.resolve()
              .then(() => Promise.all([p4(x1, x2), p5(x2, x3)]))
              .then(results => [x4, x5] = results)
              .then(() => p6(x4, x5, x10))
              .then(_x6 => x6 = _x6)
              .then(() => {
                let x7, x8, x9
                return Promise.resolve()
                  .then(() => Promise.all([p7(x4, x6), p9(x5, x6)]))
                  .then(results => [x7, x9] = results)
                  .then(() => p8(x6, x7))
                  .then(_x8 => x8 = _x8)
                  .then(() => p11(x8, x9))
              })
          })
      )
  )
yet smaller, as Promise.resolve() was totally redundant here...
// NOTE async {} = asynchronous scope (it groups parallel awaits => await||)
// NOTE p? = function call that returns a promise (? = just an index)
async {
const x1 = await || p1()
const x2 = await p2(x1)
const x3 = await || p3()
const x10 = await p10(x2, x3)
let x4, x5, x6
async {
x4 = await || p4(x1, x2)
x5 = await || p5(x2, x3)
x6 = await p6(x4, x5, x10)
}
let x7, x8, x9
async {
x7 = await || p7(x4, x6)
x8 = await p8(x6, x7)
x9 = await || p9(x5, x6)
}
await p11(x8, x9)
}
Promise.all([p1(), p3()])
  .then((x1, x3) =>
    p2(x1)
      .then(x2 =>
        p10(x2, x3)
          .then(x10 => {
            let x4, x5, x6
            return Promise.all([p4(x1, x2), p5(x2, x3)])
              .then(results => [x4, x5] = results)
              .then(() => p6(x4, x5, x10))
              .then(_x6 => x6 = _x6)
              .then(() => {
                let x7, x8, x9
                return Promise.all([p7(x4, x6), p9(x5, x6)])
                  .then(results => [x7, x9] = results)
                  .then(() => p8(x6, x7))
                  .then(_x8 => x8 = _x8)
                  .then(() => p11(x8, x9))
              })
          })
      )
  )
I don't think I like "await ||" as a syntax, because it doesn't have anything to do with the "OR" operator.
It does avoid adding a bunch of potentially optimization-breaking if statements to the transpiled output for synchronous code like in my example, though, because you only get the behavior for the promises you actually choose to collect with it.
Manuel, I am not sure I understand your examples. You are consuming the values x1 and x2 in p4 right in the middle of the same async block that contains the "await ||" statements that produce them. They are not guaranteed to resolve until after that block is over, no?
I don't think I like "await ||" as a syntax, because it doesn't have anything to do with the "OR" operator.
It is just a quick proposal to symbolise parallelism easily (||), but just that; of course it could be any other more convenient symbol.
Manuel, I am not sure I understand your examples. You are consuming the values x1 and x2 in p4 right in the middle of the same async block that contains the "await ||" statements that produce them. They are not guaranteed to resolve until after that block is over, no?
The async {} block would group only the parallel awaits (await||) into a Promise.all and would chain it in order with the other series awaits inside (normal await).
So the parallel awaits would execute before the series awaits?
Not exactly. It would group the parallel awaits at the position where the first parallel await|| statement appears.
following the demo i sent:
// NOTE async {} = asynchronous scope (it groups parallel awaits => await||)
// NOTE p? = function call that returns a promise (? = just an index)
async {
const x1 = await || p1()
const x2 = await p2(x1)
const x3 = await || p3()
const x10 = await p10(x2, x3)
let x4, x5, x6
async {
x4 = await || p4(x1, x2)
x5 = await || p5(x2, x3)
x6 = await p6(x4, x5, x10)
}
let x7, x8, x9
async {
x7 = await || p7(x4, x6)
x8 = await p8(x6, x7)
x9 = await || p9(x5, x6)
}
await p11(x8, x9)
}
In the outer block, the first statement is already parallel, so all parallel awaits in this block will group at that first position, as siblings of this first one (that is, p1 and p3).
The same applies in the inner blocks (it's just a coincidence that I created the demos this way; it could be different and still work): in the first inner block, p4 and p5, and in the second inner block, p7 and p9.
"transpiling" it to promises, it would resolve a complex scenario like following (the demo could be improved, of course):
// it would resolve a tree of parallel and series like the following with traditional promises
Promise.all([p1(), p3()])
  .then((x1, x3) =>
    p2(x1)
      .then(x2 =>
        p10(x2, x3)
          .then(x10 => {
            let x4, x5, x6
            return Promise.all([p4(x1, x2), p5(x2, x3)])
              .then(results => [x4, x5] = results)
              .then(() => p6(x4, x5, x10))
              .then(_x6 => x6 = _x6)
              .then(() => {
                let x7, x8, x9
                return Promise.all([p7(x4, x6), p9(x5, x6)])
                  .then(results => [x7, x9] = results)
                  .then(() => p8(x6, x7))
                  .then(_x8 => x8 = _x8)
                  .then(() => p11(x8, x9))
              })
          })
      )
  )
What if there are series awaits before the first parallel await?
No problem: if there was one before, then it would be chained first in the outer promise, followed by the Promise.all for the next parallel group.
What if the parallel awaits depend on the results of those series awaits? It reads very strangely.
That's totally valid as long as the parallel await is located after the series awaits.
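A small sketch of the chaining being described, using hypothetical pA..pD functions (each returning a promise):
// Source with a series await before the first parallel group:
//   const a = await pA()
//   const b = await|| pB(a)
//   const c = await|| pC(a)
//   const d = await pD(b, c)
//
// Would transpile to roughly:
pA()
  .then(a => Promise.all([pB(a), pC(a)]))
  .then(([b, c]) => pD(b, c));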
By the way, I think it can also be improved without the need for async {} blocks, just by marking the parallel awaits (await||) together and writing the code as if in series, grouping the parallel awaits when transpiling:
// NOTE async {} = asynchronous scope (it groups parallel awaits => await||)
// NOTE p? = function call that returns a promise (? = just an index)
const x0 = await p0()
const x11 = f11() // sync code in-the-middle
const x1 = await || p1(x0)
const x3 = await || p3(x11)
const x2 = await p2(x1)
const x10 = await p10(x2, x3)

const x4 = await || p4(x1, x2)
const x5 = await || p5(x2, x3)
const x6 = await p6(x4, x5, x10)

const x7 = await || p7(x4, x6)
const x9 = await || p9(x5, x6)
const x8 = await p8(x6, x7)
await p11(x8, x9)
// it would resolve a tree of parallel and series like the following with traditional promises
p0
  .then(x0 => {
    const x11 = f11()
    return Promise.all([p1(x0), p3(x11)])
      .then((x1, x3) =>
        p2(x1)
          .then(x2 =>
            p10(x2, x3)
              .then(x10 =>
                Promise.all([p4(x1, x2), p5(x2, x3)])
                  .then((x4, x5) =>
                    p6(x4, x5, x10)
                      .then(x6 =>
                        Promise.all([p7(x4, x6), p9(x5, x6)])
                          .then((x7, x9) =>
                            p8(x6, x7)
                              .then(x8 => p11(x8, x9))
                          )
                      )
                  )
              )
          )
      )
  })
I think your idea is too much magic, in the sense of not being straightforward to understand. It actually executes subsequent code first, e.g.
async {
  x1 = await || doStuffAsync1();
  x2 = await doStuffAsync2();
  x3 = await || doStuffAsync3();
}
In your idea, doStuffAsync3() is executed and completed before doStuffAsync2() even begins! I think this would add bugs due to the confusion, more particularly if outside variables were modified and relied upon.
Why not just maximally preserve current JavaScript for parallel execution, just by omitting await in multiple async calls, simply wrapping it in an await.all block to ensure completion before code continues past the block. This surely is the more straightforward way to satisfy the same goals?
Why not just maximally preserve current JavaScript for parallel execution, just by omitting await in multiple async calls, simply wrapping it in an await.all block to ensure completion before code continues past the block. This surely is the more straightforward way to satisfy the same goals?
Because wrapping it in an await.all block on the one hand explicitly groups the promises, but it brings complexity to the assignment of return values, especially when constants (const) are involved. So, if you avoid blocks and just mark the parallel awaits with, for example, a suffix (await||), or whatever other more convenient notation, you can just write the code in series as normal and avoid that complexity. The transpiler would just need to group the consecutive marked parallel awaits (await||) into a Promise.all() and that's it. Below I reproduce the earlier demo:
// NOTE p? = function call that returns a promise (? = just an index)
// NOTE s? = function call that runs synchronously and returns a value (? = just an index)
const x0 = await p0()
const x11 = s11() // sync code in-the-middle
const x1 = await || p1(x0)
const x3 = await || p3(x11)
const x2 = await p2(x1)
const x10 = await p10(x2, x3)
const x12 = s12() // sync code in-the-middle
const x4 = await || p4(x1, x2)
const x5 = await || p5(x2, x3, x12)
const x6 = await p6(x4, x5, x10)
const x7 = await || p7(x4, x6)
const x9 = await || p9(x5, x6)
const x8 = await p8(x6, x7)
await p11(x8, x9)
// it would resolve a tree of parallel and series like the following with traditional promises
p0
.then(x0 => {
const x11 = s11()
return Promise.all([p1(x0), p3(x11)])
.then((x1, x3) =>
p2(x1)
.then(x2 =>
p10(x2, x3)
.then(x10 => {
const x12 = s12()
return Promise.all([p4(x1, x2), p5(x2, x3, x12)])
.then((x4, x5) =>
p6(x4, x5, x10)
.then(x6 => Promise.all([p7(x4, x6), p9(x5, x6)])
.then((x7, x9) => p8(x6, x7)
.then(x8 => p11(x8, x9))
)
)
)
})
)
)
})
OK I'm even more confused now. x1 is surely not a resolved value until all the next "non parallel await" so is it "undefined" until then?
Could you give an example of what you mean by the await.all { ... } block syntax bringing "complexity to the assignment of return values, especially when constants (const) are involved", as I'm unclear what you are referring to.
OK I'm even more confused now. x1 is surely not a resolved value until all the next "non parallel await" so is it "undefined" until then?
As a const, x1 does not exist until those parallel awaits (await||, for p1 and p3) are resolved (same for x3). Then p2 is resolved after that.
what it tries to bring is a simplification of syntax.
Could you give an example of what you mean by the await.all { ... } block syntax bringing "complexity to the assignment of return values, especially when constants (const) are involved", as I'm unclear what you are referring to.
const and let are block-scoped, so this is about avoiding that block notation and keeping the ability to assign bindings directly in the current scope without extra segments / blocks.
However, I still find things to solve with it, like how to handle two consecutive, dependent parallel groups (no series awaits nor sync statements in the middle). For that, the initial proposal using arrays would fit, but I understand that is what you are trying to avoid.
const [x1, x2] = await|| [p1(), p2()]
const [x3, x4] = await|| [p3(x1), p4(x2)]
... do whatever with x3 and x4
If I have, as per your examples,
x1 = await||actionAsync1()
x2 = await||actionAsync2(x1) //x1 is undefined here, only resolved on the next "non-parallel-await"
vs adding a line between the two calls:
x1 = await||actionAsync1()
let c;
x2 = await||actionAsync2(x1)
...does the let c automatically break the parallel grouping since it's a non-parallel operation (thereby making x1 defined)?
It seems to me like you are doing block logic without blocks, which I think increases the chances of bugs. Also you're not leveraging the existing parallel execution pattern for async functions (which is just to omit the await), so it would seem you would be increasing learning overhead. And, you're not really allowing for submitting more sophisticated mixtures of serial async, parallel async and synchronous code for a "parallel completion" guarantee, by requiring that parallel calls be "grouped" together in terms of lines of code, almost allowing for nothing beyond the current "Promise.all" pattern, logically. I don't think this is satisfying the original motivation.
For an await.all { ... } block, maybe allowing a "return"/"yield" value could neaten up the block scope separation, but maybe that could be left to the "do" expression if that were adopted. But I don't think it's a big sacrifice if neither exists.
Just wanted to drop in and remind people of this by me earlier in the thread: esdiscuss.org/topic/proposal-await-all-for-parallelism#content-10
The way things are shaping up, it's starting to look like an ad-hoc version of this proposal of mine: isiahmeadows/non-linear-proposal
As I stated earlier, I feel it's premature, especially before we figure out how observables fit into it all.
I don't know that an await.all { ... } block would be premature, especially since it's straightforward, so I can't see it clashing with anything in the future, e.g. on the "observables" front, if that were to become a thing. If the semantics of await were to be extended somehow, then the semantics of await.all { ... } would naturally and identically be extended too.
It seems to me like you are doing block logic without blocks, which I think increases the chances of bugs.
I agree. Without curly braces, it's not always clear when the parallel code is guaranteed to have executed by. The first version of my proposal did something similar:
const a = await|| doSomethingAsync();
const b = await|| doSomethingElseAsync();
const [aValue, bValue] = await async.all;
...where async.all represents something like Promise.all(promises). The problem is, if you forget the await async.all then your promises never execute (within the function), so accidentally using await|| instead of await would have the opposite of the intended effect.
async { await|| ... } sidesteps both of these issues: it makes it clear when the promises have all settled by, and if await isn't allowed in the curly brackets then it avoids the "wrong operator" confusion issue as well.
you're not leveraging the existing parallel execution pattern for async functions (which is just to omit the await), so it would seem you would be increasing learning overhead.
If await|| (or whatever) is learned in conjunction with async {} blocks, and is only allowed within them, then it just becomes "this is the syntax for parallelism." And as already stated, you'd need an explicit marker for which expressions are being awaited in parallel for any reasonable transpilation.
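To make the "explicit marker" point concrete, a hedged transpilation sketch (reusing the getXAsync/getYAsync names from earlier in the thread; none of this syntax exists):
// Hypothetical source, inside an async function:
//   let x, y;
//   async {
//     x = await|| getXAsync();
//     y = await|| getYAsync();
//   }
//   processXAndY(x, y);
//
// Because the parallel awaits are explicitly marked, a transpiler can collect
// exactly those two expressions, with no per-statement thenable checks:
let x, y;
[x, y] = await Promise.all([getXAsync(), getYAsync()]);
processXAndY(x, y);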
const and let are block-scoped, so this is about avoiding that block notation and keeping the ability to assign bindings directly in the current scope without extra segments / blocks.
Yeah, it'd be nice to have a syntax that doesn't create a block scope. But that strikes me as less important than making it obvious where the nondeterminism is.
I assume making these braces not create a block scope is unacceptable. And using an alternative bracket (maybe async [ await|| foo ]) would be too different from everything else in the language and might read as an array, which discourages using non-expression statements inside it.
OK, but would await|| return a promise? If so, then it would seem redundant compared to just omitting the await, as it would offer nothing different, and again, something new to learn for the same logical behaviour. Otherwise, it can only really return undefined, which would seem inconsistent with both using await and omitting await. Therefore, I would recommend just omitting await inside an await.all block as the pattern for doing “await until all done” parallelism.
To simplify the problem of working with promises in parallel, I propose this new syntax:
async function initialize() {
  let foo, bar, baz;
  await.all {
    foo = (await request('foo.json')).data;
    bar = (await request('bar.json')).data;
    baz = (await request('baz.json')).data;
  }
  render(foo, bar, baz);
}
Each child statement of the curly braces is evaluated in parallel and execution resumes when they've all resolved.
The Problem: with current syntax, the above function would probably look something like this:
async function initialize() {
  const [
    { data: foo }, // renaming response.data => foo
    { data: bar },
    { data: baz },
  ] = await Promise.all([
    request('foo.json'),
    request('bar.json'),
    request('baz.json'),
  ]);
  render(foo, bar, baz);
}
For this kind of use case, Promise.all leads to "parallel lists" of promises and their return values, which must be kept in sync. Using those values either requires (sometimes deep) destructuring or temporary variables. This structure is also just fundamentally different from working serially in async/await and it forces you to reason about the problem in a specific way. This doesn't appear to be a conscious decision to force good code practices; it's just a limitation that falls naturally out of the current syntax. Thus, we have an opportunity to shift some of the burden back to the language with this new syntax.
Here's the full proposal: mrjacobbloom/proposal-await-all -- let me know what you think!