Improving detachment for array buffers
# Domenic Denicola (11 years ago)
It looks like Dave is taking another approach to this problem with a SharedArrayBuffer idea: https://blog.mozilla.org/javascript/2015/02/26/the-path-to-parallel-javascript/
He then builds something similar to what I want on top of it: https://gist.github.com/dherman/5463054
Interesting times.
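For flavor, here's a minimal sketch of how the SharedArrayBuffer approach shapes up (my own illustration, not code from the blog post or gist; `fill-worker.js` is a hypothetical worker script): the worker writes into shared memory, and the main thread only reads after the worker signals completion.

```js
// main.js --- a sketch of the SharedArrayBuffer idea: the memory is shared
// rather than transferred, so avoiding observable data races is a matter of
// protocol between the two sides.
const sab = new SharedArrayBuffer(1024);
const bytes = new Uint8Array(sab);

const worker = new Worker("fill-worker.js"); // hypothetical worker script
worker.postMessage(sab); // shared, not transferred: both sides see the memory
worker.onmessage = () => {
  // The worker has signaled that it finished writing; reading before this
  // message arrives would be an observable data race.
  console.log(bytes[0]);
};

// fill-worker.js (hypothetical):
//   onmessage = ({ data: sab }) => {
//     new Uint8Array(sab).fill(42); // write the payload into shared memory
//     postMessage("done");          // hand observation back to the main thread
//   };
```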
-----Original Message-----
From: Domenic Denicola
Sent: Wednesday, February 25, 2015 16:20
To: es-discuss at mozilla.org
Cc: Chris Wilson; Dmitry Lomov
Subject: Improving detachment for array buffers
We're running into some tricky issues trying to design readable byte streams, related to preventing observable data races while also minimizing the number of copies. I wrote up a fuller explanation of the problem at [1] for those interested. Apparently similar problems have been encountered by a variety of web specs, including web audio, EME, MSE, web crypto, and DataCues.
The basic problem is that we need a way to detach a section of an array buffer, so that another thread can write to it without being observed, and then un-detach it when the other thread is done. This would allow us to write code like
```js
const ab = new ArrayBuffer(1024);
readableByteStream.readInto(ab, 0, 256).then(bytesRead => {
  // here ab bytes 0-256 can be accessed;
  // bytes 0-bytesRead (often 0-256) will have the data
});
// here ab bytes 0-256 are detached---cannot be accessed, presumably throwing upon access
```
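Under those semantics, reading several segments into the same buffer composes naturally: `ab` stays valid (outside the in-flight region) and stays the same variable throughout. A hypothetical sketch, using the same imagined `readInto` as above:

```js
const ab = new ArrayBuffer(1024);

// Each read detaches only its own region while in flight; the rest of ab,
// and the ab binding itself, remain usable the whole time.
readableByteStream.readInto(ab, 0, 256)
  .then(() => readableByteStream.readInto(ab, 256, 256))
  .then(() => {
    // Both regions are re-attached; the data is visible through the same
    // ab binding, with no buffer variables to thread along.
    const view = new Uint8Array(ab, 0, 512);
  });
```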
What do we think? Are there technical reasons that make detaching sections of an array buffer a bad idea or hard to implement? Or is this a plausible future?
NOTE: The proposed-for-ES2016 [ArrayBuffer.transfer][2] seems to be motivated by similar issues. It can get us part of the way there, but the result is awkward:
```js
const ab = new ArrayBuffer(1024);
readableByteStream.readInto(ab, 0, 256).then(({ result, bytesRead }) => {
  // here ab is (still) completely detached,
  // but result is a 1024-byte ArrayBuffer with bytes 0-bytesRead having the data
});
// here ab is completely detached, and no longer usable
```
This gets kind of ridiculous when you imagine reading more than one segment into the array buffer: for each segment you have to pass in the currently-non-detached array buffer, and you get back a new array buffer variable you need to pass along---even though the entire time you are working with the same backing memory, just transferred between the different JS variables.
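Spelled out with the transfer-based signature above (hypothetical, like everything here), two sequential reads already force you to thread a fresh buffer variable through each step, even though the backing memory never changes:

```js
const ab = new ArrayBuffer(1024);
readableByteStream.readInto(ab, 0, 256)
  .then(({ result: ab2 }) => {
    // ab is now detached; ab2 owns the same backing memory
    return readableByteStream.readInto(ab2, 256, 256);
  })
  .then(({ result: ab3 }) => {
    // ab2 is now detached too; only ab3 is usable: three variables,
    // one allocation
    const view = new Uint8Array(ab3, 0, 512);
  });
```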
[1]: https://gist.github.com/domenic/de5e63ae8b1b8b8c3f6c
[2]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/transfer