Additional language features

# Christian Mayer (15 years ago)


Hello everyone!

Currently I'm writing a (for me) large project that heavily uses JavaScript / ECMAScript. During that project I found a few features missing in the language that could easily be added, and which I think many programmers could profit from:

  1. A printf compatible format string

For example, the String object could be extended by an sprintf-type function / method that takes a printf compatible format string and the additional values. Currently there are lots of libraries that provide that functionality - but they all cover only a small subset, and it's not known which of them are in good shape...

  2. A binary type conversion including float

My project uses AJAX technology to transmit measurement data over the net in JSON notation. This data contains IEEE float values, encoding their bit and byte representation in hex values. In the browser I need to convert that hex string back into a float value - which is quite complicated, as I have to implement an IEEE 754 "parser".

Especially for use with WebSockets it would be a great help to have functionality for packing and unpacking binary data into ECMAScript objects.

A possible syntax (and a much better description of what I need) is given by the "pack" and "unpack" functions in the Perl language.
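For what it's worth, the typed-array work discussed later in this thread makes such a decoder short to write; here is a minimal sketch using DataView (the helper name hexToFloat32 is made up for illustration):

```javascript
// Decode an 8-hex-digit big-endian IEEE 754 single-precision value
// into a JavaScript number.
function hexToFloat32(hex) {
  const buf = new ArrayBuffer(4);
  const view = new DataView(buf);
  view.setUint32(0, parseInt(hex, 16), false); // write big-endian
  return view.getFloat32(0, false);            // read back as float32
}

console.log(hexToFloat32("3f800000")); // 1
console.log(hexToFloat32("40490fdb")); // ~3.1415927
```

The explicit endianness flag on DataView is exactly what avoids the portability hazards of raw casting.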

  3. A fast library for small, fixed size vectors (numerical arrays)

For 2D and 3D purposes it would be great to have a data type / object that is specialized for 2D, 3D and 4D values. It might even map internally to a SIMD datatype if the CPU running the interpreter supports it (e.g. an SSE type value for x86 processors; other CPU architectures have similar extensions). Especially for Canvas and WebGL it would be great to have such a data type.

For this data type it would be great to have a library supporting it and providing linear algebra functionality (matrix multiplication, etc.). The C++ library Eigen2 provides everything that is necessary (and even more...).

So I hope I didn't send my ideas to the wrong list (if so, please correct me!). And I hope you can tell me whether those additions could make it into the next spec of ECMAScript.

Thanks, Christian Mayer


# Brendan Eich (15 years ago)

On Mar 5, 2011, at 5:41 AM, Christian Mayer wrote:

  1. A printf compatible format string

strawman:string_format, strawman:quasis

We discussed both at the last meeting and quasis is on the agenda for the next one. Mike or Mark might want to say more.

  2. A binary type conversion including float
  3. A fast library for small, fixed size vectors (numerical arrays)

strawman:typed_arrays (part of WebGL) strawman:binary_data (proposed for ES Harmony)

Dave Herman has even prototyped binary data on top of typed arrays:

dherman/structsjs

# Christian Mayer (15 years ago)


Thanks for the fast response!

On 05.03.2011 17:31, Brendan Eich wrote:

On Mar 5, 2011, at 5:41 AM, Christian Mayer wrote:

  1. A printf compatible format string

strawman:string_format, strawman:quasis

Both handle nicely the placement of "external" values into a string at given positions.

But this is only one part of a printf format string. What I'm still missing is the ability to define the format of the inserted data. To me this is the more important part, as it can't easily be achieved in current ECMAScript.

Here I think of stuff like:

sprintf( "%f", 1.2345 ) => "1.2345"

sprintf( "%.2f", 1.2345 ) => "1.23"

sprintf( "%5.2f", 1.2345 ) => " 1.23"

sprintf( "%05.2f", 1.2345 ) => "01.23"

sprintf( "%+05.2f", 1.2345 ) => "+1.23"

sprintf( "%e", 1.2345 ) => "1.2345e0" ...

The big advantage of the printf style format string is that (nearly?) every programmer knows it, and there are countless references, explanations and tutorials on the net.

Oh, and it could be easily combined with the approaches above, I guess.
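To illustrate, a toy sprintf covering only the "%f" family from the examples above might look like this (the function name and the tiny format grammar it accepts are illustrative, not a proposal; real printf supports many more conversions):

```javascript
// Handles a single "%[+][0][width][.precision]f" conversion.
function sprintf(fmt, value) {
  return fmt.replace(/%([+]?)(0?)(\d*)(?:\.(\d+))?f/, (_, plus, zero, w, prec) => {
    // Format the absolute value with the requested precision (C default: 6).
    let s = Math.abs(value).toFixed(prec === undefined ? 6 : +prec);
    const sign = value < 0 ? "-" : plus ? "+" : "";
    const width = w ? +w : 0;
    // Zero-padding goes between the sign and the digits.
    if (zero) s = s.padStart(Math.max(0, width - sign.length), "0");
    s = sign + s;
    return s.padStart(width, " "); // space-padding goes in front of the sign
  });
}

console.log(sprintf("%.2f", 1.2345));   // "1.23"
console.log(sprintf("%+05.2f", 1.2345)); // "+1.23"
```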

  2. A binary type conversion including float
  3. A fast library for small, fixed size vectors (numerical arrays)

strawman:typed_arrays (part of WebGL) strawman:binary_data (proposed for ES Harmony)

That's going the right way :)

But I miss the linear algebra library to go with it. Especially for the "binary data" approach, as it removes an ordering that might be implicitly known - sorry, I don't know how to express that better, so I'll make an example:

// A point or vector in projective geometry - as usually used for 3D
// and advanced 2D:
const Vec3D = new StructType({ x: float32, y: float32, z: float32, w: float32 });

This is valid, as everybody knows that the parts of such a vector are named x, y, z and w. And for everybody that vector is identical to an array with 4 elements that maps to:

a = new ArrayType(float32, 4)
a[0] = Vec3D.x
a[1] = Vec3D.y
a[2] = Vec3D.z
a[3] = Vec3D.w

This array allows the use of normal linear algebra algorithms: translate, scale, project, ... by multiplying a 4x4 matrix with that vector.

But we could also define the vector as:

const OtherVec3D = new StructType({ x: float32, z: float32, w: float32, y: float32 });

The OtherVec3D would also be recognized by every programmer as a valid data type for the intended use case - but its internal order doesn't fit any usual convention in linear algebra, and thus there is no mapping to an array that could be used efficiently by a linear algebra lib (which hopefully uses the SIMD instructions of the CPU).

My suggestion is to create a "duality" for access, i.e.:

v = new Vec3D;
v.x = 1.0;
v[0] == v.x; // -> true

=> This allows access to the elements in whichever way is best in the current situation. (A programmer would usually use the ".x" notation to extract a specific value, and the "[]" notation in algorithms.)

=> But this would also require that the order of the elements in the {...} of the StructType({...}) is mapped 1:1 to the array positions. (AFAIK the order of elements in an Object is currently undefined, although all browsers seem to keep the initialisation order.)

And, to make sure that a highly optimized implementation (SIMD instructions...) is possible, the language standard should predefine the most common types (i.e. Vec2Dfloat, Vec3Dfloat and Vec4Dfloat) and supply a library to handle those (especially the scalar product, matrix-vector product and matrix-matrix product).
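The suggested duality can already be sketched on top of today's typed arrays; here a hypothetical Vec4 class (not part of any proposal) aliases named fields onto array indices:

```javascript
// A Float32Array-backed 4-vector where v.x aliases v[0], v.y aliases
// v[1], and so on - the "duality" of access described above.
class Vec4 extends Float32Array {
  constructor() { super(4); }
  get x() { return this[0]; } set x(v) { this[0] = v; }
  get y() { return this[1]; } set y(v) { this[1] = v; }
  get z() { return this[2]; } set z(v) { this[2] = v; }
  get w() { return this[3]; } set w(v) { this[3] = v; }
}

const v = new Vec4();
v.x = 1.0;
console.log(v[0] === v.x); // true
```

Because the backing store is a flat Float32Array, the fixed memory layout needed for SIMD or WebGL buffers comes for free.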

CU, Christian Mayer


# P T Withington (15 years ago)

On 2011-03-05, at 14:38, Christian Mayer wrote:

On 05.03.2011 17:31, Brendan Eich wrote:

On Mar 5, 2011, at 5:41 AM, Christian Mayer wrote:

  1. A printf compatible format string

strawman:string_format, strawman:quasis

Both handle nicely the placement of "external" values into a string at given positions.

But this is only one part of a printf format string. What I'm still missing is the ability to define the format of the inserted data. To me this is the more important part, as it can't easily be achieved in current ECMAScript.

FWIW, OpenLaszlo has a fairly complete implementation of 'printf':

svn.openlaszlo.org/openlaszlo/trunk/WEB-INF/lps/lfc/compiler/LzFormatter.lzs

created long before the quasis proposal (which is quite nice). I agree there needs to be a way to direct how the value is interpolated, and I suspect this is what the comment "Meta data can be attached using a syntactic convention chosen by the quasi function" is meant to hint at. An example along the lines of printf format control might be nice...

# David Herman (15 years ago)

But I miss the linear algebra library to go with it.

Can you send references to example libraries for other systems that you would like to see?

Especially for the "binary data" approach, as it removes an ordering that might be implicitly known - sorry, I don't know how to express that better, so I'll make an example:

// A point or vector in projective geometry - as usually used for 3D
// and advanced 2D:
const Vec3D = new StructType({ x: float32, y: float32, z: float32, w: float32 });

I'm not sure if the wiki page mentions this, but objects are ordered in JavaScript, and the intention of the spec is to exploit this. The data would be laid out in order, so that for example when copying the data to a WebGL buffer, you would get them in the order they appear in the type.

My suggestion is to create a "duality" for access, i.e.:

v = new Vec3D;
v.x = 1.0;
v[0] == v.x; // -> true

This seems like a reasonable convenience, and maybe also a way to enforce the ordering of struct fields a little more explicitly in the spec.

(As an aside: part of the purpose of the binary data spec was to avoid casting. By giving programmers a more expressive library of type descriptors, they can create compound types that express the kinds of data needed for applications like WebGL. And we'd really rather avoid providing cast operations, since they expose platform-specific characteristics like endianness, leading to portability hazards. So for that reason, I'm loath to allow struct types to be cast to array types. But I don't think there's any harm in allowing indexed access to struct members.)

=> But this would also require that the order of the elements in the {...} of the StructType({...}) is 1:1 mapped to the array positions.

As I say, that was already the intention.

(AFAIK the order of elements in an Object is currently undefined, although all browsers seem to keep the initialisation order)

This is something we intend to specify explicitly in the next edition of the spec. See:

http://wiki.ecmascript.org/doku.php?id=strawman:enumeration

And, to make sure that a highly optimized implementation (SIMD instructions...) is possible, the language standard should predefine the most common types (i.e. Vec2Dfloat, Vec3Dfloat and Vec4Dfloat) and supply a library to handle those (especially the scalar product, matrix vector product and the matrix matrix product)

Yeah, a matrix library would be nice to have. OTOH, we tend to avoid putting too many libraries in the standard, preferring instead to let the community experiment with libraries. Your point about optimization is good, though: a matrix library could be highly optimized by the JS engines, more so than many other kinds of libraries. Still, in the end it comes down to resources: do we have the time and people to spec the library; do implementors have the resources to implement the library; or does someone have the resources to write an open source implementation that could potentially be shared by the many VM's...

Thanks for your feedback!

Best,

# Christian Mayer (15 years ago)


On 05.03.2011 23:38, David Herman wrote:

But I miss the linear algebra library to go with it.

Can you send references to example libraries for other systems that you would like to see?

A big favourite of mine (I'm biased, though...) is the Eigen2 library (LGPL3+): eigen.tuxfamily.org/index.php?title=Main_Page

Using the small, fixed size subset of that lib and exporting the interface to ECMAScript should give perfect coverage - and an optimal implementation.

But looking at the (much older) "SG" lib of PLIB: plib.sourceforge.net/sg/index.html gives a nice overview of every function needed from linear algebra for doing 3D graphics work. (It was created for OpenGL as it came to personal computers.)

If the licence of the ECMAScript implementation allows it, I would just take Eigen2 (or Eigen3)...

And, to make sure that a highly optimized implementation (SIMD instructions...) is possible, the language standard should predefine the most common types (i.e. Vec2Dfloat, Vec3Dfloat and Vec4Dfloat) and supply a library to handle those (especially the scalar product, matrix vector product and the matrix matrix product)

Yeah, a matrix library would be nice to have. OTOH, we tend to avoid putting too many libraries in the standard, preferring instead to let the community experiment with libraries. Your point about optimization is good, though: a matrix library could be highly optimized by the JS engines, more so than many other kinds of libraries. Still, in the end it comes down to resources: do we have the time and people to spec the library; do implementors have the resources to implement the library; or does someone have the resources to write an open source implementation that could potentially be shared by the many VM's...

Exactly. It might be possible to use the prototype functionality to provide a smooth fallback for implementations that don't provide that lib yet. But this library would profit enormously from a native implementation.

BTW: the big breakthrough the GeForce graphics cards brought was that those operations moved from the CPU to the GPU. So I think it would bring a big speed increase to move this from interpreted to compiled code...

As I wrote above, look at Eigen2 and just use it - it even comes with SIMD support for best CPU utilisation and performance. A specialisation for 2D, 3D and 4D, plus the Dynamic case for everything generic, should do it. All that you need (and even more...) is documented at eigen.tuxfamily.org/dox-devel/QuickRefPage.html

If you include "Core" and perhaps "Geometry" you've got everything that you need - in a very short time. (Adding "LU", you've got enough to make 99% happy.)

Thanks for your feedback!

Thanks for the good work.

(And I'm looking forward to using those features in my project! ;)

CU, Christian


# David Herman (15 years ago)

A big favourite of mine (I'm biased, though...) is the Eigen2 library (LGPL3+):

I can't speak for other browser vendors, but I think that license isn't compatible with Mozilla's codebase. But thanks for the reference.

Using the small, fixed size subset of that lib and exporting the interface to ECMAScript should give perfect coverage

Can you be more explicit about what you mean by "perfect coverage?" What set of use cases are you trying to address? Are we talking specifically 3D graphics? If that's the case, it's not clear whether that's really appropriate for the ECMAScript spec. As I say, we standardize very few libraries, usually just the smallest set needed to provide core functionality.

Just looking around, there are already lots of WebGL matrix libraries out there:

http://www.google.com/search?q=webgl+matrix+library

BTW: the big breakthrough the GeForce graphics cards brought was that those operations moved from the CPU to the GPU.

Then again, WebGL is already exposing the GPU to JS programmers. IANA GPU expert, but can you not already farm out matrix math to the GPU via GLSL?

As I say, I'll look into this, but a standardized matrix library doesn't seem as high priority to me as the rest of the binary data spec.

# Alex Russell (15 years ago)

On Mar 5, 2011, at 3:59 PM, David Herman wrote:

A big favourite of mine (I'm biased, though...) is the Eigen2 library (LGPL3+):

I can't speak for other browser vendors, but I think that license isn't compatible with Mozilla's codebase. But thanks for the reference.

Using the small, fixed size subset of that lib and exporting the interface to ECMAScript should give perfect coverage

Can you be more explicit about what you mean by "perfect coverage?" What set of use cases are you trying to address? Are we talking specifically 3D graphics? If that's the case, it's not clear whether that's really appropriate for the ECMAScript spec. As I say, we standardize very few libraries, usually just the smallest set needed to provide core functionality.

Just looking around, there are already lots of WebGL matrix libraries out there:

www.google.com/search?q=webgl+matrix+library

BTW: the big breakthrough the GeForce graphics cards brought was that those operations moved from the CPU to the GPU.

Then again, WebGL is already exposing the GPU to JS programmers. IANA GPU expert, but can you not already farm out matrix math to the GPU via GLSL?

Sure, but why? Latency to go to the GPU sucks, having to pack your data in GPU-specific structures sucks, etc., etc.

GLSL may only be a win if all the other work you want to do is also going to happen on-GPU.

As I say, I'll look into this, but a standardized matrix library doesn't seem as high priority to me as the rest of the binary data spec.

Dave




# Dave Herman (15 years ago)

Is this an argument for including a matrix library in the spec? 'Cause my point was just, if the rationale for including it is that host implementations could exploit the GPU, well, so could JS.

# Christian Mayer (15 years ago)


On 06.03.2011 00:59, David Herman wrote:

Using the small, fixed size subset of that lib and exporting the interface to ECMAScript should give perfect coverage

Can you be more explicit about what you mean by "perfect coverage?" What set of use cases are you trying to address? Are we talking specifically 3D graphics? If that's the case, it's not clear whether that's really appropriate for the ECMAScript spec. As I say, we standardize very few libraries, usually just the smallest set needed to provide core functionality.

Just looking around, there are already lots of WebGL matrix libraries out there:

Linear algebra is much more than 3D (and 2D) graphics - although that's a very prominent use that currently has a big focus in web development.

The traditional use case (but then with matrices and vectors of more than 4 dimensions) is solving (linear) systems of equations. This has so many uses that I wouldn't know where to start explaining them. They are used not only in mathematics but also in all the natural sciences (physics, chemistry, biology, engineering, ...), in economics, ...

That's why I want an ultra efficient but fixed size 2D, 3D and 4D specialisation and a fast arbitrary dimension implementation.

That there are lots of WebGL matrix libraries supports that point: it's highly needed and a missing feature.

Such a library has to cover the data types:

    • scalar (= normal number, nothing to do here)
    • vectors (= array; in the two variants of an n×1 and a 1×n matrix)
    • matrices (= 2 dimensional array)

These data-accesses:

    • select a vector or matrix element -> scalar
    • select a column or row of a matrix -> vector
    • transpose a vector -> vector
    • transpose a matrix -> matrix
    • "glue" enough vectors together -> matrix

And to cover these operations:

    • vector + vector = vector
    • matrix + matrix = matrix
    • scalar * vector = vector
    • scalar * matrix = matrix
    • (row-)vector * (column-)vector = scalar (i.e. scalar product)
    • (column-)vector * (row-)vector = matrix
    • (row-)vector * matrix = (row-)vector
    • matrix * (column-)vector = (column-)vector
    • matrix * matrix = matrix

These are the basic operations; all other operations build on them.

The extended set of basic operations adds element-wise multiplication of two vectors and of two matrices.
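As a concrete instance of the matrix * vector operation listed above, a plain-JavaScript 4x4 column-major multiply might look like this (a sketch only; a native library would map this onto SIMD instructions):

```javascript
// Multiply a 4x4 column-major matrix (flat array of 16) by a 4-vector.
// Column-major means element (row, col) lives at index col * 4 + row,
// matching the convention used by OpenGL / WebGL.
function mat4MulVec4(m, v) {
  const out = new Float32Array(4);
  for (let row = 0; row < 4; row++) {
    out[row] = m[row]      * v[0] +
               m[row + 4]  * v[1] +
               m[row + 8]  * v[2] +
               m[row + 12] * v[3];
  }
  return out;
}

// The identity matrix leaves the vector unchanged:
const I = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
console.log(mat4MulVec4(I, [1, 2, 3, 1])); // [1, 2, 3, 1]
```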

This is also so basic that there has been, for ages (starting with Fortran...), an API for doing exactly that, and lots of highly optimized, well known libraries that follow that API. The key word is "BLAS".

Note: these libraries are extremely well optimized, but they target the arbitrary-dimension case with many dimensions, so they are usually too slow for the small, fixed size case needed in graphics. Including such a library in a specific ECMAScript implementation would be a good start - but it would still require a bit of extra code to handle the small, fixed size case. (For doing those efficiently there's also heaps of information on the net, e.g. software.intel.com/en-us/articles/optimized-matrix-library-for-use-with-the-intel-pentiumr-4-processors-sse2-instructions)

To reach "great coverage", those basic operations should be extended by:

    • LU decomposition
    • cross product of two vectors in the 3D case

And a "perfect coverage" would include:

    • quaternions
    • SVD decomposition
    • doing decompositions in place and add matrix multiplications that handle those

None of this is very new either. The magic key word to search for when looking for existing libraries is LAPACK.

BTW: The big breakthrough the GeForce graphics cards brought was that those operations moved from the CPU to the GPU.

Then again, WebGL is already exposing the GPU to JS programmers. IANA GPU expert, but can you not already farm out matrix math to the GPU via GLSL?

Of course you try to do as much matrix math as possible on the GPU, and modern OpenGL (thus WebGL too) helps you with that, e.g. by using GLSL. But it's mostly one way: you write the shader, load it onto the GPU and from then on just feed it data. If you need results back, it's not good anymore. GPUs are optimized for stream processing - if you use them interactively you'll lose.

So you need the tools on the client to also work with the data. E.g. to prepare the data. Or to tell the GPU exactly what to do (example: to position an object in 3D you have to multiply a few matrices together until you have the final matrix that the GPU uses to multiply all vectors with). Or to do collision detection. Or ...

And doing 2D work (Canvas, SVG) wouldn't give you access to the GPU.

Just think of a totally different application: we've got a time series of data (e.g. profit per month for the last year) and want to display it (e.g. on a Canvas or with SVG). That's currently easily possible. And now we want to show a trend line, i.e. a line that follows the data, to filter out the noise and give a bit of prediction of the future. The usual approach is a "least squares" line, i.e. a line where the squared error between the line position and the data position is minimal. Doing that now is quite tedious. With the lib outlined above, it's a piece of cake.
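For reference, the least-squares line mentioned here has a closed-form solution that is only a few lines even without a matrix library (a sketch; with the proposed lib it would be a single small linear solve):

```javascript
// Fit y = slope * x + intercept minimizing the sum of squared errors,
// using the closed-form normal equations for a line.
function linearFit(xs, ys) {
  const n = xs.length;
  let sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (let i = 0; i < n; i++) {
    sx  += xs[i];
    sy  += ys[i];
    sxx += xs[i] * xs[i];
    sxy += xs[i] * ys[i];
  }
  const slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
  return { slope, intercept: (sy - slope * sx) / n };
}

// Points on y = 2x + 1 recover slope 2 and intercept 1:
console.log(linearFit([0, 1, 2, 3], [1, 3, 5, 7]));
```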

As I say, I'll look into this, but a standardized matrix library doesn't seem as high priority to me as the rest of the binary data spec.

The matrix library would need a good binary data structure to reach its full potential. And a binary data spec would soon raise the question of a matrix lib - as you've shown above, WebGL has already demonstrated the need for one.

CU, Christian


# Erik Corry (15 years ago)

2011/3/5 Christian Mayer <mail at christianmayer.de>:


Hello everyone!

Currently I'm writing a (for me) large project that heavily uses JavaScript / ECMAScript. During that project I found a few features missing in the language that could easily be added, and which I think many programmers could profit from:

  1. A printf compatible format string

For example, the String object could be extended by an sprintf-type function / method that takes a printf compatible format string and the additional values. Currently there are lots of libraries that provide that functionality - but they all cover only a small subset, and it's not known which of them are in good shape...

This seems like a library issue and not a language issue. Why not just pick one, add the stuff you need, and open source the result? Unless you are willing to wait several years for browsers to support your project, you will have to do this anyway.

  2. A binary type conversion including float

My project is using AJAX technology to transmit measurement data over the net in JSON notation. This data contains IEEE float values, encoding their bit and byte representation in hex values.

That seems like a poor choice for an interchange format. ASCII fp values work rather well. What is the space penalty for decimal vs hex of gzipped data, once you have already taken the hit for the JSON overhead?

In the browser I need to convert that hex string back into a float value - which is quite complicated, as I have to implement an IEEE 754 "parser".

Especially for use with WebSockets it would be a great help to have functionality for packing and unpacking binary data into ECMAScript objects.

A possible syntax (and a much better description of what I need) is given by the "pack" and "unpack" functions in the Perl language.

  3. A fast library for small, fixed size vectors (numerical arrays)

How fast does this need to be? Did you already do benchmarks on modern browsers for the stuff you need?


For 2D and 3D purposes it would be great to have a data type / object that is specialized for 2D, 3D and 4D values. It might even map internally to a SIMD datatype if the CPU running the interpreter supports it (e.g. an SSE type value for x86 processors; other CPU architectures have similar extensions). Especially for Canvas and WebGL it would be great to have such a data type.

For this data type it would be great to have a library supporting it and providing linear algebra functionality (matrix multiplication, etc.). The C++ library Eigen2 provides everything that is necessary (and even more...).

So I hope I didn't send my ideas to the wrong list (if so, please correct me!). And I hope you can tell me whether those additions could make it into the next spec of ECMAScript.

In general I am of the opinion that if the language already offers the building blocks you need to do your task, then it requires some special reason why you can't just make a JS-level library that fits your needs. Basically, you need to have tried it and found that it doesn't work for you for some fundamental reason. Speed is not necessarily an argument, since performance is improving all the time, and improvements to the optimization of JS code are the rising tide that lifts all boats. Adding the library you need to the platform has several disadvantages:

  • It makes the platform large and unwieldy. This worsens download times, security surface, learning curve.
  • It takes years before you can rely on support.
  • It locks down the API prematurely, where JS based libraries are free to evolve.

The role of a language standards body is to say no most of the time. Anything else leads to a monster of a standard.

# Christian Mayer (15 years ago)


Hello Erik,

On 07.03.2011 10:46, Erik Corry wrote:

2011/3/5 Christian Mayer <mail at christianmayer.de>:

  1. A printf compatible format string

This seems like a library issue and not a language issue. Why not just pick one, add the stuff you need, and open source the result? Unless you are willing to wait several years for browsers to support your project, you will have to do this anyway.

It's correct that this belongs in a lib and not in the basic language syntax. But as far as I understand (please correct me if I'm wrong), the ECMAScript language comes with a library that contains things like the "String" object. That is exactly where this request is aimed.

BTW, I'm already using an external library for exactly that functionality. While searching for the best lib for my project it seemed to me that I'm not the only one missing that feature - so I volunteered to tell "upstream" that there's a little (and, I guess, quite easy to implement) thing missing that could help lots of people.

  2. A binary type conversion including float

My project is using AJAX technology to transmit measurement data over the net in JSON notation. This data contains IEEE float values, encoding their bit and byte representation in hex values.

That seems like a poor decision for an interchange format. ASCII fp values work rather well. What is the space penalty for decimal vs hex of gzipped data once you have already taken the hit for the JSON overhead?

Please don't judge the world only from your own point of view - it's larger than that. There are many use cases that need to encode and decode binary data into native data structures, e.g. (binary) file handling.

Or, in my project, (binary) field bus traffic is converted by a little daemon acting as a gateway into a JSON structure (bytes converted to hex to stay in ASCII) so that my application can fetch it with the AJAX and COMET patterns. As the gateway can't know the context of the messages - whether a value is an integer, a float, a string, ... - the JavaScript client has to decode them according to a given context.

To continue point 1): yes, I'm already using a lib for converting IEEE 754, as I can't wait multiple years until every mobile device has a new enough JavaScript version...

  3. A fast library for small, fixed size vectors (numerical arrays)

How fast does this need to be? Did you already do benchmarks on modern browsers for the stuff you need?

Of course: jsperf.com/object-vs-array/3 - but that doesn't cover this request; it only scratches the surface.

After moving 2D and 3D paint events from the script interpreter to native code (and, currently, from the CPU to hardware acceleration), this is the next logical step in increasing computation performance.

This is something where an interpreted language can gain enormous performance (just compare Matlab loops with Matlab matrix operations).

But the JIT "interpreters" that are popular these days could also take massive advantage of this language / library extension, as they would know in advance how to map it efficiently to CPU instructions. And it gets even better: this stuff is easily parallelisable to SIMD and (in the arbitrary dimension case) to threads when you can exploit information known in advance (i.e. at compile time of the interpreter itself) - which is not so easy any more when done at JIT compile time.

So I hope I didn't send my ideas to the wrong list (if so, please correct me!). And I hope you can tell me whether those additions could make it into the next spec of ECMAScript.

In general I am of the opinion that if the language already offers the building blocks you need to do your task, then it requires some special reason why you can't just make a JS-level library that fits your needs. Basically, you need to have tried it and found that it doesn't work for you for some fundamental reason.

I'm with you that it's a tough decision where to draw the line between a complete and a bloated library.

Let's revisit the three points:

  1. This could be "easily" done with the current language and library set, so it wouldn't fulfil your requirements. But it would round off the current library with a feature that has great use and is asked for by many programmers. As it's just one additional method, I argue that it's worth it.

  2. This is very hard to do with the current language set. (Of course it's not impossible - ECMAScript is a general purpose language, so nearly everything is possible, no matter how complicated...) Given how hard it is to do now, how easy it is to implement in the language of the interpreter (assuming C/C++; probably other languages too), and the current trend of web development (plus other uses of ECMAScript outside the web context), I argue that it's worth adding to the spec as well.

  3. The requested feature set can easily be achieved by writing a little pure-ECMAScript library. (This could also be used as a fallback when the interpreter doesn't support it natively.) But this library takes care of usually speed-critical and/or computationally intense stuff that would profit enormously from a native implementation. And: this library would only be complete if the arithmetic operators could be overloaded - which is currently not possible in ECMAScript.

A very important point for me in deciding whether a feature should extend a language is its "relevance", i.e. how many users (= programmers) would use it (weighed against the other points that speak for or against the feature). Feature 1) is something that I can imagine being used in most programs using ECMAScript. Feature 2) is far more specialized - but I see a bigger need in the near future, and it's currently hard to get done. Feature 3) has huge potential. Not only does the 2D and 3D world eagerly need it, it also has uses in all sorts of sciences, as I wrote in another mail. And although it's easy to implement in pure ECMAScript, it would see a massive speed gain from a native implementation.

Speed is not necessarily an argument, since performance is improving all the time and improvements to the optimization of JS code is the rising tide that lifts all boats.

I have to disagree here: although the big iron is getting bigger, lots of "small iron" is appearing. Just think of all the mobile devices that have appeared during the last few years. For them not only speed is important but also energy efficiency. And those are getting more and more 2D and 3D applications (all the different "apps", like games) written in ECMAScript...

Adding the library you need to the platform has several disadvantages:

  • It makes the platform large and unwieldy. This worsens download times, security surface, learning curve.
  • It takes years before you can rely on support.
  • It locks down the API prematurely, where JS based libraries are free to evolve.

An early lock-down can be very good in some cases - e.g. to avoid everyone heading in a different direction and having to do big refactoring (including changing proven algorithms) later on when that lib finally arrives.

The role of a language standards body is to say no most of the time. Anything else leads to a monster of a standard.

I perfectly understand that. And I'm just offering my thoughts from my current project. Take them as feedback from the community and the real world - or leave them. It's up to the standards body.

CU, Christian
