Mathias Bynens (2013-08-24T09:02:54.000Z)
On 27 Feb 2012, at 22:58, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:

> This is something that I think can be clarified for the ES6 specification, independent of the on-going discussion of the possibility of 21-bit string elements. My preference for the future is to simply define the input alphabet of ECMAScript as all Unicode characters independent of actual encoding.

That sounds nice.

> var \ud87e\udc00 would probably still be illegal because each \uXXXX defines a separate character, but: var \u{2f800} = 42; should be fine, as should the direct non-escaped occurrence of that character.

Wouldn’t this be confusing, though?

    global['\u{2F800}'] = 42; // would work (compatible with ES5 behavior)
    global['\uD87E\uDC00'] = 42; // would work, too, since `'\uD87E\uDC00' == '\u{2F800}'` (compatible with ES5 behavior)
    var \uD87E\uDC00 = 42; // would fail (compatible with ES5 behavior)
    var \u{2F800} = 42; // would work (as per your comment; incompatible with ES5 behavior)
    var 丽 = 42; // would work (as per your comment; incompatible with ES5 behavior)

Using astral symbols in identifiers would be backwards incompatible, even if the raw (unescaped) symbol is used. There’d be no way to use such an identifier in an ES5 environment. Is this a problem?
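For what it’s worth, here’s a minimal sketch (assuming an engine that already supports the ES6 `\u{…}` escape and `String.prototype.codePointAt`) showing why the two escape forms refer to the same string value, and why only the binding form is the incompatible part:

    // The code point escape and the surrogate pair escape denote the same
    // string value — one astral symbol, two UTF-16 code units.
    console.log('\u{2F800}' === '\uD87E\uDC00'); // true
    console.log('\u{2F800}'.length); // 2
    console.log('\u{2F800}'.codePointAt(0).toString(16)); // '2f800'

    // In an ES5 environment such a name is only reachable as a property key,
    // never as a binding:
    var obj = {};
    obj['\uD87E\uDC00'] = 42; // fine in ES5
    // var \uD87E\uDC00 = 42; // SyntaxError in ES5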