Brendan Eich (2013-07-15T16:01:25.000Z)
Mark S. Miller wrote:
> On Sun, Jul 14, 2013 at 10:39 PM, Brendan Eich <brendan at mozilla.com 
> <mailto:brendan at mozilla.com>> wrote:
>
>     Mark S. Miller wrote:
>
>         First, I wholeheartedly agree. JS is increasingly being used
>         as a target of compilation. When I ask people doing so what
>         their biggest pain point is, the lack of support for integers
>         is often the first thing mentioned.
>
>
>     int64/uint64 come up faster when compiling from C/C++. I've
>     prototyped these in SpiderMonkey in a patch that I'm still
>     rebasing, but planning to land pretty soon:
>
>     https://bugzilla.mozilla.org/show_bug.cgi?id=749786
>
>
> I don't think we should introduce precision limited integers into JS, 
> ever. The only point would be to do something weird on overflow of the 
> precision limit. The traditional C-like weirdness is wrapping -- the 
> only advantage being that it avoids even needing to check for overflow.

We already have precision-limited integers in JS, both via the 
bitwise-logical and shift operators, and via typed arrays / binary data. 
We need 64-bit ints for the latter, as scalar and SIMD-vector types as 
well as arbitrary-length typed array element types.
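
For concreteness, a quick sketch of the wrapping behavior we already 
live with (plain JS, runnable today; nothing below is a new API and the 
variable names are just for illustration):

    // Bitwise ops coerce through ToInt32, so values wrap at 2^32:
    var big = Math.pow(2, 32) + 5;
    console.log(big | 0);        // 5 -- high bits discarded
    console.log(-1 >>> 0);       // 4294967295 -- ToUint32 view of the same bits

    // Typed array elements are fixed-width and wrap on store:
    var u8 = new Uint8Array(1);
    u8[0] = 260;
    console.log(u8[0]);          // 4 -- 260 mod 256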

This ship has sailed; bignums are not equivalent or a superset. These 
types are in the hardware, and they need to be in JS for the 
"memory-safe low road" role it plays for WebGL, asm.js, etc.

>         Were we not introducing TypedArrays until ES7, would we have
>         typedArray.length be a bignum rather than a floating point
>         number? If so, is there anything we can do in ES6 to leave
>         ourselves this option in ES7?
>
>
>     That would be unwanted, overkill. All typed arrays want is memory
>     capacity, and 64 bits is more than enough. Even 53 is enough, and
>     that's where ES6 is going, last I talked to Allen. People do want
>     >4GB typed arrays.
>
>
>
> When a bignum represents a small integer, there's no reason that it 
> needs to be any slower than a JS floating point number representing a 
> small integer. Most JS implementations already optimize the latter to 
> store the small integer in the "pointer" with a tag indicating that 
> the non-tag bits are the small integer value. Exactly the same trick 
> has been used for bignums in Lisps and Smalltalks for many decades.

Sure, I'm familiar -- but again those older implementations did not face 
the performance constraints of the "low road" that JS does, nor were 
they quite so aggressively optimized.
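
For the record, a minimal sketch of that tagging trick, simulated in JS 
on 32-bit words -- real engines do it in C++ on machine words, and every 
name below is made up purely for illustration:

    // Small-integer ("SMI") tagging, sketched in JS.
    function boxNumber(n) {
      // Small integers fit in 31 bits: keep the value in the tagged word itself.
      if ((n | 0) === n && n >= -(1 << 30) && n < (1 << 30)) {
        return { word: (n << 1) | 1 };           // low bit set = "this is an int"
      }
      // Everything else gets a heap "box" (here, a one-element double array).
      return { word: 0, heap: new Float64Array([n]) };
    }
    function unboxNumber(v) {
      return (v.word & 1) ? (v.word >> 1) : v.heap[0];
    }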

I'm not saying bignums couldn't be used as doubles are today -- they could.

Rather, JS has number and always will, and the engines' speculation 
toward int is sunk cost (good sunk cost in a practical sense). Waiting 
for bignums to land before typed arrays, just to support length types 
bigger than any foreseeable memory and get the same optimizations as 
number, does not fly. We won't wait in ES6 -- we will be lucky to get 
integer-domain doubles instead of uint32 lengths for typed arrays, if 
not for arrays too.

> As long as the numbers represent small integers, I think the only 
> differences would be semantics, not performance.

Both, in the "short run", in practice. But we aren't even doing bignums 
in ES6, and we are doing typed arrays / binary data. We need to satisfy 
>4G memory buffer use cases now, and be future-proof. 53 bits is enough.
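
The arithmetic behind "53 bits is enough", in plain JS (variable names 
are illustrative only):

    var maxExactInt = Math.pow(2, 53) - 1;   // largest integer a double holds exactly
    var uint32Limit = Math.pow(2, 32);       // where uint32 lengths top out (~4G)
    console.log(maxExactInt);                // 9007199254740991
    console.log(maxExactInt / uint32Limit);  // ~2097152 -- millions of times past 4G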

/be