minutes, TC39 meeting Tues 5/22/2012

# David Herman (13 years ago)

These are a combination of my edits of Rick's notes from the first topic, my notes from subsequent topics, and then my reconstruction from memory of the conversation on the last topic. So for those present, please feel free to correct the record.

Dave


People:

  • BT: Bill Ticehurst, Microsoft
  • LH: Luke Hoban, Microsoft
  • BE: Brendan Eich, Mozilla
  • DH: Dave Herman, Mozilla
  • AWB: Allen Wirfs-Brock, Mozilla
  • AR: Alex Russell, Google
  • EA: Erik Arvidsson, Google
  • MM: Mark Miller, Google
  • RW: Rick Waldron, jQuery
  • YK: Yehuda Katz, jQuery
  • DC: Doug Crockford, eBay
  • OH: Oliver Hunt, Apple

Topic: Binary data

DH: typed arrays were designed for two use cases:

  • shipping data from CPU -> GPU for WebGL
  • doing binary file/network I/O

DH: former requires matching the endianness expected by the GPU for interpreting shader scripts (e.g. that read bytes as int32 or float64), so they designed it to use the system's native endianness DH: latter requires explicitly specifying endianness, which the DataView API allows DH: but casting ArrayBuffers to different types exposes system endianness, and basically the entire web is implemented on little endian systems, so the web is being written with the assumption of little endian, despite not being specified DH: we should specify little endian. bi-endian systems (which many modern big-endian systems actually are) have HW support for little-endian; but big-endian systems can implement byte swapping by compiling shaders to do the byte swapping themselves YK: I agree; hard for devs to reason about endianness. expecting large use cases for binary data? DH: yes; e.g. crypto algorithms, optimizing bignum libraries; I wrote a "float explorer" that displays bit patterns of IEEE754 doubles by casting, and discovered after months that my explicit check for system little-endianness was flawed and always produced true, but I couldn't test on big-endian hardware so I only discovered the flaw by chance. AWB: agree that most devs will not understand endianness differences LH: do expect some big-endian UA's DH: right, in particular game consoles; some have hardware support for bi-endian; how robust that support is is unknown. but can still implement byte-swapping by compiling shaders. DH: right now, there's no one implementing big-endian browsers making the case that they can't implement little-endian, and meanwhile the web is being written to assume little-endian. YK: if we standardize little-endian, does it become impossible to discover the system's endianness? DH: yes, but we could add a feature-detection API that explicitly provides that information. you wouldn't need it though. DH: if real perf issues arise, willing to consider little-endian by default with explicit opt-in to big-endian. still possible to write unportable code (e.g. blind copy-paste programming), but at least less likely. but let's not solve the problem till we know we have it AWB: endianness is clear for integers, but what about floats? aren't there other variations e.g. what order the mantissa and exponent and sign go in? DH: not sure, but we should just standardize whatever format the web is using. AWB: yes. if you know you need big endian, you'll do the conversion DH: mostly that's probably just for I/O, where DataView does that for you automatically DH: standardize little endian? EA, YK: de facto standard. LH: I need to contact people on my end; there may yet be big-endian browsers DH: broadly speaking, little-endian has won in the hardware world. regardless, this is becoming the de facto standard. HW support will likely get better and better, and leaving it unspecified is solving a problem that doesn't exist, while creating issues of its own. BT, BE: DataView defaults to big-endian DH: yes, people are unhappy about that inconsistency. we could change DataView to default to little-endian for consistency. but in the world, big-endian has won as the network byte order, while little-endian has won as the CPU byte order. BE: a foolish consistency! DH: right-- the defaults are modelled on reality, and reality is lumpy BE: yes. <<provides some historical insight>>
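
[Editor's note: a minimal sketch, not from the minutes, of the two behaviors under discussion: casting typed-array views over one buffer exposes the platform's byte order, while DataView takes endianness explicitly and defaults to big-endian.]

// Casting two views over the same buffer reveals the hardware byte order:
var buf = new ArrayBuffer(4);
var u32 = new Uint32Array(buf);
var u8  = new Uint8Array(buf);
u32[0] = 0x0a0b0c0d;
var isLittleEndian = (u8[0] === 0x0d); // true on little-endian hardware

// DataView, by contrast, is explicit about endianness (big-endian by default):
var dv = new DataView(buf);
dv.setUint32(0, 0x0a0b0c0d);        // big-endian write
dv.setUint32(0, 0x0a0b0c0d, true);  // little-endian only when requested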

YK: what observable effects would this change have on WebGL? DH: likely none. AFAIK when you create data for WebGL, you don't use casting in ways that care about endianness, you just ship to the GPU, and that's what's sensitive to the endianness YK: could we eliminate the observable effects by just converting all the data en masse? DH: no, not that simple; you have to know which bytes need to be swapped DC: I thought we were going to replace typed arrays. can we just leave it out? BE: they'll go to W3C. this is reality; we must embrace DH: we can embrace and then create better, more ergonomic extensions DC: tell me more DH: typed arrays are views over ArrayBuffers, which are buckets of bytes; we add a new view type of structs:

S = struct({
    x: uint8,
    y: uint8
});

S is not a struct object but a struct type. then:

x = new S

creates a new instance of this struct type

A = Array( S )

again, a type, not an instance

YK: that's a little confusing DH: whole idea is an API for creating new types, so there's a meta-level here that is inherent complexity; can certainly bikeshed API names later LH: idea we discussed before was making the .buffer of a struct null unless you explicitly constructed it wrapping an existing buffer DH: yes, and once we do that, we can add pointer types:

S = struct({ x: uint32, foo: object })

DH: for security, absolutely cannot expose the buffer to casting DH: also, enforce alignment constraints to maintain invariant that normal ArrayBufferViews never have unaligned access; will write this up and could use help checking my logic AWB: isn't there an inconsistency about when there's padding and when there isn't? DH: no, if you create an unaligned struct type:

S = struct({ x: uint8, y: uint32 })

then if you construct it fresh, you can't observe the padding, and if you wrap it around a buffer, you get an error:

o = new S; // can't access buffer
O = new S(buf, 2) // error: unaligned type

DH: as always, you can do unaligned access via data view:

d = new DataView(buf, 0)

v = d.get(S, 17) // unaligned read

DH: in that example, v is an object pointing to index 17 in buf, and property accesses do the proper logic to perform unaligned access LH: since strings are pointers, would also be good to be able to have a string type BE: Waldemar wanted that a while back DH: I think he was talking about inline strings, which are best treated as a byte array, using Josh Bell's encoding/decoding API (in progress); but I agree we should make it possible to have all JS values; string type is a no-brainer, +1 <<everyone generally favorable>>
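
[Editor's note: a hedged sketch of how the pieces above are meant to compose; the constructor arguments shown for wrapping an existing buffer are illustrative guesses at the strawman API, not settled spec.]

S = struct({ x: uint8, y: uint8 });  // a struct *type*, not an instance
A = Array(S);                        // an array-of-S *type*

p = new S();          // fresh instance; its backing buffer is not exposed
p.x = 3; p.y = 4;

buf = new ArrayBuffer(8);
a = new A(buf, 0, 4); // illustrative only: a view over an existing, aligned buffer
a[0].x = 255;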

Topic: Classes

LH: anything we standardize finalizes basic choice of syntax; maxmin decides we're going to have a new kind of body instead of trying to make class be a constructor-like concept AWB: we need to agree on something, so let's agree on this basic structure MM: constructor syntax is DOA for good reason, b/c of the scope issues; in my proposal, class parameters were in scope but methods were instance methods; but this was a non-starter for implementation LH: I think there are ways to mitigate: put constructor locals in scope but poison them RW: that's too confusing BE: howling, screaming wart MM: what's the advantage over maxmin? LH: syntactic weight is fairly significant AR: there's weight, but it's a syntactic entryway to more features LH: blocked off one key door, the constructor syntax, which is the shortest and simplest of all possibilities MM: one of the big advantages of JS is simple lexical scope; confusion of poisoned names is more important cognitively and the shorter syntax doesn't compensate DH: what's the syntactic convenience? LH: one less level of indentation YK: at cost of clarity RW: and you added public! BE: which cost is worse, the public keyword or not having the thing you're calling new on be hoisted to the top? DH: public keyword is almost outright hostile to programmers; goes completely against the rest of the way JS works YK: given that maxmin is so minimal, I would like some escape valves for making it something we can build on (extension hooks) AWB: we need consensus on base level first, before we go into extensions to the proposal AWB: could you live with maxmin alone? YK: no. I will receive pressure to use class syntax for my existing class systems, but without at least one escape valve, I will have to choose between saying no or saying yes and killing features AR: there'll be needs for things like mixins EA: that's been debunked; the RHS is an AssignmentExpression; mixins are perfectly possible AR: but there will be various things that can't be done AWB: you can do this with a function that implements your hook YK: that asks my users to call a special function MM: there's simply no proposal for this additional feature for us to evaluate AR: will you attempt to stone maxmin before it gets out of the gate, before we can agree? YK: well, no LH: I won't try to stone classes, but I really hope if we're going to put something in the spec, a) it can actually be used in most cases where most people want to use it, and b) it be forwards-compatible with addressing remaining places AWB: I wouldn't be behind this if I didn't think both those things are true YK: I think those things are true; I am personally concerned that I will receive requests for things I can't implement AWB: I think people will have issues in the short-term but they'll eventually improve DH: rollout and adoption are important; YK, I'd like to know more specifically what you can do now that you can't do with maxmin RW: try/catch never would've happened if people said "I can't use this now"; in 3 years, I think you'll rebuild Ember with classes YK: here's why I know it's a real issue: people want Ember classes to work with CoffeeScript classes, and I have these exact issues AR: no one's suggesting that it's impossible to do more than maxmin -- even in ES6 time frame -- but we need to agree on base foundation DH: I think Luke's OCaml syntax is too confusing, it doesn't win enough for convenience, and it's less mainstream BE: OCaml! I knew it!
LH: this clearly has limited amounts of support AR: we've been through this part of the design process and sympathize, but we've already come to the conclusion it doesn't work MM: keeping scope simple is the most important thing IMO LH: but what I miss is the instance properties, the ability to see the shape of the object is very valuable, e.g. for completion lists in tools AWB: if you have maxmin as starting point, there are ways to address it LH: the proposed extensions I've seen are awkward AR: this has been solved in other languages e.g. C++ initialization list LH: I won't stand in the way DH: pulse? LH: I think Waldemar was totally opposed, I'm not totally opposed, just had concerns DC: my concern: this isn't new capability; but if it doesn't address all the needs, it'll just add more confusion and won't help AWB: there's significant debate on a lot of things that could make things easier, but there's a basic structuring that is useful: it's useful not to have to wire up the prototypes properly DC: but a lot of people are doing this with libraries AR: but then they're not interoperable DC: I don't feel compelled to go forward with something we can't be sure is right YK: I see this being on the way to that, but not there yet BE: you want something, Waldemar wants something else... YK: I think it's ok if it's a subset, but we should be clear about that, so I can hold the line against using it DC: tactically I'm opposed to that; if we're on the way but not there, let's do nothing AR: I was in your camp some time ago. so much complexity to build a proposal that hangs well together that you will have to throw some use cases overboard. we have a long history of not doing classes on the basis that we should wait to do it right. but we also have a long history of iterating on and improving on things that are already in the language. I have faith that we will work to iterate DC: I'm saying we shouldn't ship until we're sure we're right DH: design does not have empirical or rationalistic methods of validation. all you can do is discuss, prototype, build examples, and try it out, all of which are part of the Harmony process DC: I'm fine with consensus BE: Waldemar wants the ability to have "final" or "sealed" objects that raise errors on mis-named properties, which Mark would like as well AR: I thought Waldemar wanted only that MM: I think this is a subset of the original class proposal from last spring that Waldemar was on board with; the restrictions were only on const classes BE: other than future-hostility concerns, which aren't falsifiable MM: can check whether this is forward-compatible with the May proposal AWB: that wasn't sound, it was just a whiteboard sketch BE: Mark's saying that we can grow in the direction of the things Waldemar wants AWB: I specifically asked Waldemar to say specifically what he thought this was interfering with, but he hasn't come back with anything; don't see why that couldn't be added with more syntax AR: ISTM there's a fight over defaults DH: if const is default, we will all be burned at the stake AR: can't we get something useful if we don't solve the const problem? BE: it's not clear to me people are asking for what Waldemar is asking for MM: let's not spend time guessing Waldemar's position BE: sure, let's just separate the default question from the const question; can we agree const should not be default?
MM: I think that was always the agreement since last May BE: I think when Waldemar returns, we may have to find a way to get past const, to achieve consensus AWB: I think we should move forward, start specifying maxmin, knowing that we can remove it later YK: I would like to put a stake in the ground MM: until Waldemar can make his case, this isn't consensus AR: there's some room here to suggest we can't wait forever EA: can we start saying let's build on this instead of waiting another two months? MM: let's not push until Waldemar has returned AR: OK, but we do have a deadline, and there's been plenty of time to register complaints EA: I would just like to start prototyping AWB: and I would like to start doing spec work MM: sure, we should all push on all fronts, but we don't have consensus without Waldemar's agreement BE: correct
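
[Editor's note: for reference, the "maximally minimal" shape being debated is roughly the following; this is a hedged illustration, not final spec text. A class body holds only a constructor and methods, with no instance-property or visibility syntax.]

class Point {
  constructor(x, y) {
    // instance state is wired up imperatively in the constructor,
    // which is what LH's object-shape/tooling concern is about
    this.x = x;
    this.y = y;
  }
  dist() {
    return Math.sqrt(this.x * this.x + this.y * this.y);
  }
}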

Topic: __proto__

OH: it's a getter/setter in JSC MM: one sane approach: refuses to change prototype of object born in another global; other sane approach: refuses to modify prototype of any object that inherits from that context's Object.prototype DH: what about the getter? MM: always works; equivalent to Object.getPrototypeOf AWB: exposing the setter as a function means that any method call with one argument could potentially be a __proto__ modifier DH: can't you already do that with a function that delegates to .__proto__ ? AWB: if you thought you deleted Object.prototype.__proto__ but someone squirreled away a copy of the setter BE: I just don't like it MM: not fatal for security BE: we're inventing something new that's not been tested on some pure aesthetic of not wanting magic data properties. I just don't like it DH: when we already do that with array.length and indexed properties anyway! BE: this is a turd we do not want to polish MM: I don't like the way it's specified DH: sure. I think we agree about the behavior BE: agreement: it's in Annex B, it'll be a pseudo-data property MM: I don't think that's clear EA: I would pick an accessor BE: this isn't a clean-slate design experiment! MM: b/c Firefox doesn't implement the accessor? BE: b/c no one other than JSC very recently MM: we can analyze the security properties BE: this is not about security, it's about unintended consequences YK: is there an advantage to making it an accessor? MM: yes: the actual action is that it has a magic side effect BE: that's an aesthetic preference OH: the pre-accessor behavior of JSC was that every object instance had the magic property; from our point of view, being able to extract the accessor is in no way different from what you could already do BE: acid test: o.__proto__ = null -- what happens? OH: in old JSC, it remained magic DH: comes down to distaste of another magic data property vs distaste of a portable, well-specified proto-updater function YK: people will use it BE: and maybe someone will come up with some zero day, I can't predict that DH: it's strictly more risk for sake of a purely aesthetic reason, in an already aesthetically nasty space BE: yes, should be a risk analysis, not an aesthetic analysis YK: I agree MM: I would like a more modular specification than poisoning the semantics of [[Get]] etc; but think of how expensive the magic of array length has been BE: this is already shipping in a bunch of browsers, and in some of them it's already a magic property; the developer-facing cognitive load is a wash; developers just want it to work, they don't care whether it's an accessor DH: I can predict the security bugs: the implementor just thinks about the normal case, but the attacker takes the accessor out, installs it on an object that inherits from a proxy to an object from another global etc. etc. and something internal breaks MM: that's the most compelling argument I've heard. the additional testing surface area is much bigger DH: Arv? EA: I'm not gonna block this OH: I think it would be nice to have a mechanism that let you specify getters/setters that weren't extractable DH: actually proxies basically are such a mechanism AWB: I don't like getters/setters that can't be reflected
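
[Editor's note: the extraction hazard MM and DH debate can be shown concretely. This is a hedged sketch of what an ordinary-accessor __proto__ would allow, not a statement of what any engine shipped at the time.]

// If __proto__ is a plain accessor on Object.prototype, its setter can be
// pulled off and applied to objects that never inherited __proto__ at all.
var setProto = Object.getOwnPropertyDescriptor(Object.prototype, '__proto__').set;
var victim = Object.create(null);        // has no __proto__ property of its own
setProto.call(victim, Array.prototype);  // ...yet its prototype is now swapped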

Topic: Spec terminology

AWB: confusion about "host object"; I'd like to eliminate concept of "host object" entirely, introduce new terminology

  • object: runtime entity w/ unique identity and exposes properties
  • mundane object: object that uses default behaviors for chapter 8 internal methods
  • exotic object: object that provides non-default behavior for at least one of the internal methods

DH: I would say that "plain" is better than "mundane"; also value objects, were they to succeed in Harmony, don't have identity AWB: we can cross that bridge when we come to it DH: anyway I like this; makes clearer where the extension points of the language are AWB: note that proxies are exotic MM: are array instances exotic? AWB: yes DH: is that a problem? MM: no, just surprised me AWB: I'd like functions and other "normal" things to be classified as mundane DH: but arrays should still be exotic? AWB: yes DC: Ecma members can be "ordinary" AWB: ok, "ordinary" and "exotic" AWB: another dimension of classification?
  • standard object: defined by ES specification
  • built-in object: provided by ES implementation but not defined in ES specification
  • platform object: provided by host environment

<<mass confusion>>

AWB: I agree these distinctions are unclear DH: I think the first dimension was great! it moves us in the direction of not needing the second dimension AWB: I agree, and I will see if it's possible to eliminate the second dimension AWB: now, a dimension for functions:

  • function: an object that exposes the [[Call]] internal method
  • ECMAScript function: function whose invocation result & side effects are provided by evaluating ECMAScript code
  • alien function: function whose invocation result & side effects are provided in some manner other than by evaluating ECMAScript code
  • standard function: function whose invocation result and side effects are defined by the ECMAScript specification (mostly ch 15)

DH: why do we need this? job of implementation is to make it indistinguishable that a function is alien AWB: setting up initial environment has to keep track of variable environment etc BT: it's unobservable DH: you can't tell! LH: are you saying there's a flaw in the current ES5 spec? AWB: when you activate a chapter 15 function, you don't go through any of the normal process MM: Function.prototype.toString has requirements on functions that must be ECMAScript functions; so there's a distinction between "must-be" and "may-not-be" AWB: but Chapter 15 functions need access to the current global's intrinsics (e.g. for "initial meaning of new Array" etc) BE: you can get to that via [[Scope]] DH: I would rather eliminate the alien function distinction, and the steps in Chapter 15 can be implemented in pure ES AWB: I'll just have to go through and see how many distinctions I can eliminate MM: what are the "intrinsics"? DH: things like where the spec says "as if via new Array for the initial value of Array"; loaders rationalize this, and each loader has its own set of intrinsics YK: "realm" is used in basic auth DH: "realm" is a little thesaurus-y; I like "context" but "ExecutionContext" is used in ES already, so "global context" AWB: "context" doesn't feel big enough MM: so [[Scope]] gets you to the context? AWB: you can share global objects between loaders! DH: oh right, so [[Scope]] is not it -- needs its own [[Context]] AWB: top-level environment record gets you to it BE: don't have two independent fields that are required to agree DH: yeah, as long as we can get to the global context via the top-level environment record, we should just get to it via that DH: popping the stack, I like "global context" YK: I like "global context" BE: "context" is almost meaningless MM: "shire"... lulz AWB: "island"? DH: <<winces uncomfortably>>

DH: "home"? AWB: not bad... already using for something else but I could rename it

Topic: do-expressions

DH: allows you to evaluate expressions with control flow or temporary variables:

a = do { let tmp = f(); tmp * tmp + 1 };

workaround:

let tmp; // scope pollution!
a = (tmp = f(), tmp * tmp + 1);

workaround:

a = (function() { var tmp = f(); return tmp * tmp + 1; })()

YK: CoffeeScript gives you the latter DH: the IIFE doesn't "say what you mean" -- boo boilerplate YK: also, TCP hazards MM: strongest argument in favor is for code generators DH: well, I think the developer convenience is an even stronger argument EA: I want this most for code generators MM: strongest or not, it's still compelling for code generators DH: arguments against that I've heard:

  • eliminates the current structure of JS syntax where statements are always at outermost level of a function body
  • exposes the completion value of statements, which is only otherwise observable via eval

YK: it's pretty clear what the result is DC: the "do" syntax will be confusing to those familiar with do-while YK: you can think of it as do-while-false DH: I think it reads really naturally as English: "do this" means do it once, "do this while blah" means do it some number of times DH: pulse of the room? <<mostly favorable>>
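
[Editor's note: a small sketch of the "TCP hazard" point raised above: wrapping code in an IIFE changes what |this| refers to, while a do-expression leaves it alone. The do-expression line assumes the strawman's completion-value semantics.]

var obj = {
  n: 2,
  viaIIFE: function () {
    return (function () { return this && this.n; })();  // |this| is no longer obj
  },
  viaDo: function () {
    return do { this.n * this.n };  // |this| is still obj; result is 4
  }
};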

LH: my standard concern: the overall cost of so much new syntax; I would certainly use this, and I like it, but all these conveniences add up in cost DC: I don't think we ever consider the overall design DH: sick of that accusation; we constantly consider the overall design BE: there is an important point about the cost of syntax, and we need to take account of all of the syntax we've proposed AWB: worth making everyone come up with a priority list? BE: that's a little "fighty", but we will need to take stock and make some cuts at some point DH: I imagine I'd cut do-expressions before anything else EA: I want them more than comprehensions DH: yeah, I've championed comprehensions for so long I hadn't really reconsidered; I love them but I could see how you might prioritize do-expressions DH: I want to make a couple points about the big picture:

  • there is a meme that TC39 doesn't understand complexity budgets and is out of control
  • please do not spread this meme; it's false
  • one thing the recent controversy, including JSFixed, has shown me is that TC39 needs to improve our communication
  • there's a lot of misinformation, and a lot of unhappy people, when they start talking with us, end up finding out they're actually pretty happy with the direction ES6 is going in
  • I will be redoing the entire wiki to focus on community first, committee second, and to make the big picture clearer
  • but we all could think about ways to increase our communication with the community

AWB: in some ways the anger is actually a consequence of being open DH: yes, but we can still improve our communication BE: we still need to think about whether and what we should scale back AR: people are resistant to change; once features land they'll use them DH: there's a tension between needing to grow a language incrementally and a standards process that takes many years between revisions; either underserve users by simply not growing enough, or freak them out by landing many things at once YK: roll-out of individual features in browsers can help, even preffed off DH: just want to say that we do consider the complexity budget, we have cut a ton of things, and the ES6 feature set has really strong coherence AR: go team!
# Waldemar Horwat (13 years ago)

Since you're speculating on my position, here it is:

  • Classes don't hang together unless we have agreement on some declarative way to specify properties, referred to as "const classes" in the meeting notes.
  • It's fine for that to not be the default, but we must have agreement on how to do it.

I'll be out of the office for a few days, but we can discuss it when I'm back.

 Waldemar
# Brandon Benvie (13 years ago)

The last discussion point there is really important, I think. I get a strong sense of the general JS developer world feeling no connection at all to this process, and much of that can be put directly on the sheer timescale. We've seen 2-month browser version cycles come in force, the "living standard" that is HTML5, and the recent rapid movement of new APIs coming out to JS from not-TC39. People don't feel connected to the things in ES6 because they rightfully can't envision using them in an imaginable timeframe.

# Brendan Eich (13 years ago)

Play fair now. SpiderMonkey in Firefox prototyped let, const, generators, iterators (an earlier form), destructuring (close to final ES6), comprehensions, generator expressions, and more recently proxies, weak maps, maps, and sets. V8 joined in the proxies, weak maps, maps, and sets fun. The for-of loop from ES6 is in SpiderMonkey now. Most significantly, modules are under way in SpiderMonkey and V8.

We prototype as we go. V8 has kept stuff under a flag, and that's fair.

# Brandon Benvie (13 years ago)

It's not really my opinion. I spend all day playing with this stuff. But it's what I hear in ##javascript on freenode all the time, and the general sentiment I see in blogs often enough: that the movement is slow enough, combined with the cycle time of browser generations, that (from my observations) people don't connect with how ES6 may benefit them because they never expect to use it, not within the foreseeable future. Perhaps this is a misread on my part; it doesn't reflect what I personally think or do.

# Brendan Eich (13 years ago)

Ok, if you are just the messenger I won't shoot you ;-).

Of course, thanks to JSFixed we know you're relaying real messages, although noisy ones. I'm relying on @valueof, @rwaldron and others to sift out the signal.

Angst and exaggeration aside, there is a fair point here. JS is not evolving as fast as its libraries and compilers (hence Mozilla's research in Emscripten and LLJS). We need to do better, both on just communicating what we've already done (which often satisfies the angst-y folks), and in helping browsers show progress via prototypes that verge on de-facto and then de-jure standards.

Working on this, so is dherman (so are others).

# Domenic Denicola (13 years ago)

The prototyping efforts are appreciated, but can rarely be used in a comfortable way. (Compared to, say, HTML5.) I've thought a lot about how to feasibly use Harmony features in real-world code, but have fallen down every time. Here are some of the stumbling blocks I've encountered:

  • Node.js with --harmony flag gets you collections, old proxies, and (significantly) block scoping. But Node does not make it easy to indicate "this file needs to be run with --harmony," or e.g. to require harmony-using files from non-harmony-using libraries. So this ends up being a nonstarter for library authors, leaving it only usable by application writers. Besides, the proxies are still old, which is really unfortunate. And the iteration rate is slowww: stuff like destructuring has been harmonized for a long time, but shows no sign of making it into V8.

  • The same problems apply to desktop apps written with Chromium Embedded Framework. These will probably have more app code, but then if you want to factor any of it out into smaller redistributable modules, you limit your audience.

  • SpiderMonkey has a lot of stuff that we would love to use, and a fairly fast iteration time. (Direct proxies are almost landed, according to my bugmail!) The spec is tracked pretty well, too. But SpiderMonkey has very little uptake outside of Firefox, and most code written for Firefox must be web-compatible, so nobody except Firefox extension authors gets to use its many features.

  • Traceur seems to be coming along nicely, but its alignment with the spec leaves a lot to be desired. Destructuring just got fixed a few days ago, and they have a class syntax you have to avoid to write ES6-as-it-is--compatible code. They have features like private names that are not (really) in the spec yet, using speculative syntax (new Name was never harmonized, was it?). Monocle-mustache is in, somehow. Etc. It's ES6-like, but doesn't reflect TC39's actual progress, instead reflecting the Traceur team's interpretation of where TC39 might eventually, hopefully, end up. Subsetting might be a solution, assuming the semantics of the subsetted features are aligned with the spec, but determining that would require further investigation.

  • And of course there's the usual elephant in the room, viz. Internet Explorer. We (Barnes & Noble.com) reached out to Microsoft about including some basic, already-harmonized features in IE10 for use in writing desktop apps in HTML5/JS as per their new Windows Runtime framework. (This was desired since we are using some of them in our existing Chromium-based desktop app, namely block scoping and weak maps.) But even as a "first class partner" (or something), they were unable to grant us this request. The attitude that I personally inferred was that ES6 won't make it into IE before a finalized spec is available. This not only stalls web progress, but also that of anything that embeds Chakra (a fairly common operation on Windows, about to become much more common with Windows 8).

# Brendan Eich (13 years ago)

Domenic Denicola wrote:

The prototyping efforts are appreciated, but can rarely be used in a comfortable way. (Compared to, say, HTML5.)

Remember, HTML5 started in 2004 (WHATWG founding) and still isn't done. Eight years ago.

I've thought a lot about how to feasibly use Harmony features in real-world code, but have fallen down every time. Here are some of the stumbling blocks I've encountered:

  • Node.js with --harmony flag gets you collections, old proxies, and (significantly) block scoping. But Node does not make it easy to indicate "this file needs to be run with --harmony," or e.g. to require harmony-using files from non-harmony-using libraries. So this ends up being a nonstarter for library authors, leaving it only usable by application writers. Besides, the proxies are still old, which is really unfortunate.

That'll be fixed this year, soon I'm told.

And the iteration rate is slowww: stuff like destructuring has been harmonized for a long time, but shows no sign of making it into V8.

How would you see the signs? Just asking. I know the Munich team is going strong and they have skills. I don't know detailed schedule, but there is no need to presume inaction or action. Let's ask.

  • The same problems apply to desktop apps written with Chromium Embedded Framework. These will probably have more app code, but then if you want to factor any of it out into smaller redistributable modules, you limit your audience.

  • SpiderMonkey has a lot of stuff that we would love to use, and a fairly fast iteration time. (Direct proxies are almost landed, according to my bugmail!) The spec is tracked pretty well, too. But SpiderMonkey has very little uptake outside of Firefox, and most code written for Firefox must be web-compatible, so nobody except Firefox extension authors gets to use its many features.

This is all true now, but with more balanced browser competition and IE10 coming along, next year is a different story.

If there were a way to make this go faster, I'd want it. I don't know of a way. Do you? Again, HTML5 isn't done, isn't totally cross-browser-consistent, and started 8 years ago.

Compilers do seem attractive, not only for ES6-prototyped to ES5, but for even more advanced/experimental languages (LLJS, for example). CoffeeScript is quite usable, too.

# Rick Waldron (13 years ago)

On Wednesday, May 23, 2012 at 10:08 PM, Domenic Denicola wrote:

The prototyping efforts are appreciated, but can rarely be used in a comfortable way. (Compared to, say, HTML5.) I've thought a lot about how to feasibly use Harmony features in real-world code, but have fallen down every time. Here are some of the stumbling blocks I've encountered:

  • Node.js with --harmony flag gets you collections, old proxies, and (significantly) block scoping. But Node does not make it easy to indicate "this file needs to be run with --harmony," or e.g. to require harmony-using files from non-harmony-using libraries. So this ends up being a nonstarter for library authors, leaving it only usable by application writers. Besides, the proxies are still old, which is really unfortunate. And the iteration rate is slowww: stuff like destructuring has been harmonized for a long time, but shows no sign of making it into V8.

  • The same problems apply to desktop apps written with Chromium Embedded Framework. These will probably have more app code, but then if you want to factor any of it out into smaller redistributable modules, you limit your audience.

  • SpiderMonkey has a lot of stuff that we would love to use, and a fairly fast iteration time. (Direct proxies are almost landed, according to my bugmail!) The spec is tracked pretty well, too. But SpiderMonkey has very little uptake outside of Firefox, and most code written for Firefox must be web-compatible, so nobody except Firefox extension authors gets to use its many features.

  • Traceur seems to be coming along nicely, but its alignment with the spec leaves a lot to be desired. Destructuring just got fixed a few days ago, and they have a class syntax you have to avoid to write ES6-as-it-is--compatible code.

I just used Traceur 2 days ago and wrote max/min style classes.

They have features like private names that are not (really) in the spec yet, using speculative syntax (new Name was never harmonized, was it?). Monocle-mustache is in, somehow.

There is a lot of support and interest for the mustache in TC39

Etc. It's ES6-like, but doesn't reflect TC39's actual progress, instead reflecting the Traceur team's interpretation of where TC39 might eventually, hopefully, end up.

I don't think this is a fair claim to make, considering Traceur had no real attention for quite some time and Erik Arvidsson really stepped it up to get a nice cross section of features prototyped in a short amount of time. I think what you meant to say was "thank you".

Subsetting might be a solution, assuming the semantics of the subsetted features are aligned with the spec, but determining that would require further investigation.

  • And of course there's the usual elephant in the room, viz. Internet Explorer. We (Barnes & Noble.com) reached out to Microsoft about including some basic, already-harmonized features in IE10 for use in writing desktop apps in HTML5/JS as per their new Windows Runtime framework. (This was desired since we are using some of them in our existing Chromium-based desktop app, namely block scoping and weak maps.) But even as a "first class partner" (or something), they were unable to grant us this request. The attitude that I personally inferred was that ES6 won't make it into IE before a finalized spec is available. This not only stalls web progress, but also that of anything that embeds Chakra (a fairly common operation on Windows, about to become much more common with Windows 8).

Blaming TC39 for IE's slowness is about as appropriate as blaming me for the Boston molasses disaster.

Some feedback about your feedback: I'm wondering if you have any thoughts about the actual minutes?

# Domenic Denicola (13 years ago)

You are of course right, Brendan, and thanks for addressing my points. Certainly part of this was fueled more by frustration than informed knowledge (see below).

-----Original Message----- From: Brendan Eich [mailto:brendan at mozilla.com]

The prototyping efforts are appreciated, but can rarely be used in a comfortable way. (Compared to, say, HTML5.)

Remember, HTML5 started in 2004 (WHATWG founding) and still isn't done. Eight years ago.

Fair point. I think I and others would be reassured if we had some analogies drawn with the HTML5 timescale: e.g. how long from <article>, <section> etc. until they got included in default browser stylesheets, or how long from <input type="date"> to browser UIs. (That last one is actually a pretty good example of slow implementation progress; I feel a lot better about ES6's progress when I think about how we are just now getting a UI for <input type="date"> in Chrome.)

  • Node.js with --harmony flag gets you collections, old proxies, and (significantly) block scoping. But Node does not make it easy to indicate "this file needs to be run with --harmony," or e.g. to require harmony- using files from non-harmony-using libraries. So this ends up being a nonstarter for library authors, leaving it only usable by application writers. Besides, the proxies are still old, which is really unfortunate.

That'll be fixed this year, soon I'm told.

The proxies, or Node.js? In either case, great news! And in either case, how did you learn about this? I try to stay up to date on all relevant blogs, Twitter feeds, mailing lists, etc., but certainly could have missed one. If it was private communication, perhaps in the interest of helping developers see ES6 progress (as per Brandon's original point) such things could be more publicized in the future? I imagine that's complicated though.

And the iteration rate is slowww: stuff like destructuring has been harmonized for a long time, but shows no sign of making it into V8.

How would you see the signs? Just asking. I know the Munich team is going strong and they have skills. I don't know detailed schedule, but there is no need to presume inaction or action. Let's ask.

I guess I can't really expect to see all the signs through public channels, as above. Still, I think opening such channels would be a valuable thing for the community.

# John J Barton (13 years ago)

On Wed, May 23, 2012 at 10:08 PM, Domenic Denicola <domenic at domenicdenicola.com> wrote:

  • Traceur seems to be coming along nicely, but its alignment with the spec leaves a lot to be desired. Destructuring just got fixed a few days ago, and they have a class syntax you have to avoid to write ES6-as-it-is--compatible code. They have features like private names that are not (really) in the spec yet, using speculative syntax (new Name was never harmonized, was it?). Monocle-mustache is in, somehow. Etc. It's ES6-like, but doesn't reflect TC39's actual progress, instead reflecting the Traceur team's interpretation of where TC39 might eventually, hopefully, end up. Subsetting might be a solution, assuming the semantics of the subsetted features are aligned with the spec, but determining that would require further investigation.

Each feature in Traceur is individually controlled by options, so you can use your own judgement and turn off any feature you don't believe will make it into ES6. Traceur is open source and your contributions would be welcomed. It's pretty easy to work with.

jjb

# Domenic Denicola (13 years ago)

From: Rick Waldron [mailto:waldron.rick at gmail.com]

Etc. It's ES6-like, but doesn't reflect TC39's actual progress, instead reflecting the Traceur team's interpretation of where TC39 might eventually, hopefully, end up.

I don't think this is a fair claim to make, considering Traceur had no real attention for quite some time and Erik Arvidsson really stepped it up to get a nice cross section of features prototyped in a short amount of time. I think what you meant to say was "thank you".

Indeed you're right, that is what I meant to say. Apologies to Erik, and thanks to JJB for pointing out Traceur is much better suited for my purposes than I thought. I will definitely give it another try.

# David Bruant (13 years ago)

Le 24/05/2012 07:13, Brendan Eich a écrit :

Compilers do seem attractive, not only for ES6-prototyped to ES5, but for even more advanced/experimental languages (LLJS, for example). CoffeeScript is quite usable, too.

Source maps make this even easier and smoother. In his JSConf 2012 presentation, Paul Irish mentioned that source maps could work with Traceur. Something to keep in mind.

# Aymeric Vitte (13 years ago)

Le 23/05/2012 22:30, David Herman a écrit :

  • I will be redoing the entire wiki to focus on community first, committee second, and to make the big picture clearer

I know writers don't always have time, but maybe adding more examples, and simple ones (even if they don't reflect the whole picture), to the strawmans would help a lot. For example, the fat arrow strawman is very condensed for what it brings, and it took some time before I started understanding what the soft binding strawman was about. Here are some simple examples (hope this is correct):

Soft binding :

var o={ msg:'I am o', log:function() {console.log(this.msg)} }

var o2={ msg:'I am o2' }

o2.log=o.log;

o2.log(); //"I am o" //without soft binding it should be "I am o2"

Lexical this (and dynamic this) :

var test='aaa';

var o={test:'bbb'};

var o2={test:'ccc'};

//without lexical this
o.m=function() {var self=this; var func=function() {console.log(this.test+' '+self.test)};func()};

var f=o.m;

o2.m=o.m;

o.m(); //aaa bbb //this.test is not o.test

o2.m(); //aaa ccc //this.test is not o2.test

f(); //aaa aaa

//with lexical this
o.m=function() {var self=this; var func=()=>console.log(this.test+' '+self.test);func()}

var f=o.m;

o2.m=o.m;

o.m(); //bbb bbb //this.test is o.test //lexical |this| is associated to dynamic |this| outside func //you don't need the var self=this statement

o2.m(); //ccc ccc //same as above

f(); //aaa aaa

//other example
o.m=()=>{console.log(this.test)};

var f=o.m;

o2.m=o.m;

o.m(); //aaa //the arrow's lexical |this| is the global object where it was defined, not o

f(); //aaa //same lexical |this| no matter how f is called

o2.m(); //aaa //would be ccc if |this| were dynamic

# Andreas Rossberg (13 years ago)

On 24 May 2012 07:13, Brendan Eich <brendan at mozilla.com> wrote:

Domenic Denicola wrote:

And the iteration rate is slowww: stuff like destructuring has been harmonized for a long time, but shows no sign of making it into V8.

How would you see the signs? Just asking. I know the Munich team is going strong and they have skills. I don't know detailed schedule, but there is no need to presume inaction or action. Let's ask.

Indeed. The V8 team is committed to Harmony. But obviously, it is not the only thing we are working on, and it is a significant amount of work that will take time. The highest priority currently is modules, which are more important than, say, destructuring, because they are less well understood and having the implementation experience is important for the spec as well. There is no concrete roadmap for other features, but destructuring is relatively high on the list.

For a variety of reasons, we will not remove the harmony flag for features before they have proved stable enough. On the web, a bug in the implementation of something as intrusive as proxies can be a serious issue, for example. On the other hand, there is no reason why `node' cannot build/run V8 with the flag turned on by default, if they wish to (or the users pressure them ;) ).

# Brandon Benvie (13 years ago)

Proxies have a huge surface area to them, not to mention a whole new updated specification with some significant core differences, so I don't blame you for that. On the other hand, WeakMaps have a tiny surface area and are super useful... ;)

# Allen Wirfs-Brock (13 years ago)

On May 29, 2012, at 7:19 AM, Brandon Benvie wrote:

Proxies have a huge surface area to them, not to mention a whole new updated specification with some significant core differences, so I don't blame you for that. On the other hand, WeakMaps have a tiny surface area and are super useful... ;)

WeakMaps require a major change to any implementation's garbage collector....

# Brandon Benvie (13 years ago)

Right, but I guess I tend to look at the amount of exposure to JS-land as a guide to how big the implementation difficulties will be. I imagine the ways something like the WeakMap GC changes break are mostly cases where they just don't work efficiently, or don't work at all, whereas the ways proxies break end up as all sorts of crazy elevated-access holes.

Although... Firefox sure seems to be on my case a hell of a lot about failing to preserve wrappers for wrapped native weak map keys.

# Allen Wirfs-Brock (13 years ago)

On May 29, 2012, at 7:31 AM, Brandon Benvie wrote:

Right, but I guess I tend to look at the amount of exposure to JS-land as a guide to how big the implementation difficulties will be. I imagine the ways something like the WeakMap GC changes break are mostly cases where they just don't work efficiently, or don't work at all, whereas the ways proxies break end up as all sorts of crazy elevated-access holes.

The exposure surface of most GC changes is all of JSland...

GC bugs are among the hardest to find. GC code is generally "unsafe"; GCs operate globally upon the entire heap and may modify the contents of the stack, any object, or any internal or user data structure; the noticeable effect of a GC bug often has a significant temporal lag before it manifests; and there may be significant non-deterministic behavior in the user program that triggers a GC bug, making it hard to replicate.