Removal of language features

# David White (7 years ago)

I’m just curious: there are a lot of proposals for adding features to the ES specification, but is there any scope for requests to remove language features? That is, going via the same means, writing a proposal that would make the case for removing a feature, ultimately simplifying the specification and the language?

# kai zhu (7 years ago)

+1

# Isiah Meadows (7 years ago)

It's mostly a TC39 process, where they have to go through a very painstaking process, one that takes several years, to ensure a removal breaks virtually nothing in real-world code. So far, the only language feature successfully removed is arguments.caller. There are a few others deprecated for future removal:

  • Function.prototype.caller
  • arguments.callee
  • RegExp.$1, RegExp.global, and friends
  • Most everything banned from strict mode.
  • And likely others.
# Andrea Giammarchi (7 years ago)

FWIW I can tell you already that removing RegExp.$1, RegExp.global, and friends will break a lot of things.

Testing a RegExp and then instantly using its matches via $1 and friends is a pattern that has been used in many scripts.
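For concreteness, here is a sketch of the legacy pattern being described, next to the per-call modern alternative. (The static RegExp properties are legacy Annex B features; the strings and patterns here are just illustrative.)

```javascript
// Legacy pattern: merely testing a RegExp populates static state on the
// RegExp constructor itself (RegExp.$1, RegExp.lastMatch, and friends):
/(\d+)/.test("order 42");
console.log(RegExp.$1); // "42"

// Modern equivalent: use the match result directly; no hidden global state.
var m = "order 42".match(/(\d+)/);
console.log(m && m[1]); // "42"
```

Removing the static properties would silently break every script relying on the first form, which is Andrea's point.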

# Michael Kriegel (7 years ago)

Removing language features will introduce breaking changes. Removal is more like an agreement not to use certain features, so it is the task of a linter, which can check that you do not use "bad things". Strict mode is another way: it disables a certain set of features so that using them becomes an error. However, it is not very fine-grained.
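As a sketch of the linter approach: ESLint already ships rules covering several of the features listed earlier in the thread. The rule names below are real ESLint core rules; the particular selection and configuration shape are just an example.

```javascript
// A sketch of an ESLint configuration banning some "bad things"
// discussed in this thread:
var eslintConfig = {
  rules: {
    "no-caller": "error", // forbids arguments.callee / arguments.caller
    "no-eval": "error",   // forbids direct eval()
    "no-with": "error",   // forbids with statements
    "no-restricted-properties": [
      "error",
      { object: "RegExp", property: "$1", message: "Legacy static RegExp state." }
    ]
  }
};
// In a real project this object would be exported from .eslintrc.js:
// module.exports = eslintConfig;
console.log(Object.keys(eslintConfig.rules).length); // 4
```

This gets per-project, per-rule granularity without any change to the language itself.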

Maybe ES should introduce something like a configuration object, in which developers can enable/disable or in general configure features for their module in a fine-grained way. Of course, all features which exist at the time of introducing such a mechanism would have to be enabled by default...

# T.J. Crowder (7 years ago)

On Thu, Jul 20, 2017 at 7:13 AM, Michael Kriegel <michael.kriegel at actifsource.com> wrote:

Maybe ES should introduce something like a configuration object, in which developers can enable/disable or in general configure features for their module in a fine-grained way. Of course, all features which exist at the time of introducing such a mechanism would have to be enabled by default...

As you say, this is more a job for linters / code analysis / code quality tools. The only time it's relevant to the JavaScript engine itself is where the elimination of a feature allows the engine to do a better/faster optimization job (for instance in strict mode, with, the link between arguments and named parameters, ...) or where only the engine can know about something problematic (viz. automatic globals). Those are relatively rare.
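One concrete instance of the arguments/parameter link mentioned above, which strict mode severs (the function bodies here are just illustrative; `new Function` is used because its body is always sloppy-mode, making the contrast visible regardless of how this snippet itself is run):

```javascript
// Sloppy-mode arguments is linked to the named parameters; strict mode
// severs that link, letting engines treat parameters as plain locals.
var sloppy = new Function("a", "a = 2; return arguments[0];");
function strict(a) { "use strict"; a = 2; return arguments[0]; }

console.log(sloppy(1)); // 2: the write to `a` shows through arguments
console.log(strict(1)); // 1: arguments keeps its original value
```

Severing the link is exactly the kind of feature elimination that only helps when the engine itself knows about it.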

I can't recall details, but I seem to recall an indication on the list at some stage that there's little-to-no appetite for further "use strict"-like directives. It's a big hammer, used once so far to fix some long-standing performance and correctness pain-points. Maybe TC-39 will use it again, but the indication was, not any time soon, and certainly not just to turn off features people don't like.

-- T.J. Crowder

# Alexander Jones (7 years ago)

Removing things also frees up syntax and names for future extensions. Never removing features is simply unscalable, and it's only going to accelerate JS's demise!

I still think version pragmas are probably worth exploring to mitigate this, while not 'breaking the web' is a stated goal.

Alex

# Mike Samuel (7 years ago)

On Thu, Jul 20, 2017 at 2:13 AM, Michael Kriegel <michael.kriegel at actifsource.com> wrote:

Removing language features will introduce breaking changes. Removal is more like an agreement not to use certain features, so it is the task of a linter, which can check that you do not use "bad things". Strict mode is another way: it disables a certain set of features so that using them becomes an error. However, it is not very fine-grained.

Linters help guide good-faith developers towards the good parts(tm) of the language. Keeping all the bad parts of the language around does impose an ongoing cost though.

Obscure implementation-specific extensions and ancient rarely used features can still be rediscovered and used to break out of sandboxes. It's also the old, irregularly maintained code paths that are most likely to allow an attacker to attack underlying layers -- buffer overflows against C code via neglected IDL binding code.

As someone who has to keep current on the bad parts, I would appreciate if there were fewer of them. Old code that has legitimate reasons to use ancient features was also written and tested when machines (modulo battery-constrained devices) were slower. If old misfeatures can't be eliminated, I would prefer that old features be reimplemented in terms of newer features where possible even if that imposes a significant performance penalty since that results in fewer code paths close to the metal that are rarely tested openly.

Maybe ES should introduce something like a configuration object, in which developers can enable/disable or in general configure features for their module in a fine-grained way. Of course, all features which exist at the time of introducing such a mechanism would have to be enabled by default...

This could reduce the attack surface for code injection attacks like XSS if done at the realm level. Expanding the configuration space also has security consequences though since you now have to test against all possible configurations or make assumptions about likely or supported configurations.

# Steve Fink (7 years ago)

On 07/20/2017 03:08 AM, Alexander Jones wrote:

Removing things also frees up syntax and names for future extensions. Never removing features is simply unscalable, and it's only going to accelerate JS's demise!

I still think version pragmas are probably worth exploring to mitigate this, while not 'breaking the web' is a stated goal.

Not breaking the web is a stated goal.

A non-web embedding is free to remove whatever is unwanted, but the vast majority of resources are put into JS engines that run the web, so in practice you're going to run on engines that implement the whole spec and are not at all eager to pay for testing or maintenance overhead for non-web configurations.

Features can sometimes be usefully deprecated by not implementing them in optimizing JITs. (For example, 'with'.) Whatever invariants the features break must still be handled by the overall engine, but it's usually much easier to only handle things on slow paths.
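To illustrate why `with` is a natural candidate for the slow path only, here is a minimal sketch. (The function is built via `new Function`, whose body is always sloppy-mode, so the snippet parses even in strict contexts where `with` is a syntax error.)

```javascript
// `with` injects an object's properties into the scope chain: inside the
// block the engine cannot statically tell whether `x` resolves to obj.x,
// a closure variable, or a global, so every name lookup is dynamic.
// That is why optimizing JITs tend to leave such functions unoptimized.
var firstX = new Function("obj", "with (obj) { return x; }");
console.log(firstX({ x: 42 })); // 42
```

The engine must still support this, but only in the interpreter or baseline tiers, which keeps the cost off the hot paths.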

Version pragmas have been explored and found to be a bad idea. See exploringjs.com/es6/ch_one-javascript.html for a far better description than I would be able to produce.

# kai zhu (7 years ago)

pragmas, although not great and best used sparingly, are still the most feasible way forward to limit excessive features. that being said, es6 is such a different beast from es5 that i think a backwards-compatible “use es5” text pragma would be appreciated by the significant number of veteran frontend-programmers and established companies who still prefer writing and maintaining code in the “legacy” language-style.

this would effectively be a one-off pragma, as it's unlikely future es versions would have language changes of such magnitude to warrant it.

# T.J. Crowder (7 years ago)

On Fri, Jul 21, 2017 at 9:37 PM, kai zhu <kaizhu256 at gmail.com> wrote:

that being said, es6 is such a different beast from es5 that i think a backwards-compatible “use es5” text pragma would be appreciated by the significant number of veteran frontend-programmers and established companies who still prefer writing and maintaining code in the “legacy” language-style

Can you produce any data at all to back that up? I've never seen any appetite in that regard at all.

You do realize, presumably, that there is nothing preventing you from writing ES5 code and running it on current engines -- since ES2015, ES2016, and ES2017 are all backward-compatible with ES5. By design.

Don't like const? Don't use it. Don't like arrow functions? Don't use them. Don't like async/await? (You guessed it.) Don't use them. Nothing other than unspecified behavior is any different unless you use it. (And even in regard to unspecified behavior [I'm thinking function declarations in blocks here], the committee has bent over backward to do their best to avoid imposing changes on it.)
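For the function-declarations-in-blocks point, a quick sketch of the strict-mode semantics (wrapped in an IIFE with its own "use strict" so the behavior is the same however this snippet is run):

```javascript
// In strict mode (and in modules), a function declaration inside a block
// is scoped to that block, unlike the legacy web (Annex B) sloppy-mode
// behavior where it leaks into the enclosing function scope.
var outside = (function () {
  "use strict";
  {
    function g() { return 1; } // g is visible here, inside the block
  }
  return typeof g; // "undefined": g did not escape the block
})();
console.log(outside); // "undefined"
```

Sloppy-mode code keeps the web-legacy semantics, which is the backward bending referred to above.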

-- T.J. Crowder

# kai zhu (7 years ago)

Can you produce any data at all to back that up? I've never seen any appetite in that regard at all.

no hard data admittedly. i regularly attend tech meetups in hong kong. at these gatherings, the general sentiment from frontend developers is that creating webapps has gotten considerably more difficult with the newish technologies. even the local presenters for react and angular2+ at these talks can’t hide their lack of enthusiasm for these frameworks (like they’re doing it mainly to hustle more side-business for themselves). on-the-job, we all generally try to avoid taking on such technical risks, until we are inevitably asked to by our manager. the ones i see who are enthusiastic are typically non-frontend-engineers who write mostly backend nodejs code, that isn’t all that more scalable or interesting with what you can do in java/c#/ruby.

Don't like const? Don't use it. Don't like arrow functions? Don't use them. Don't like async/await? (You guessed it.) Don't use them. Nothing other than unspecified behavior is any different unless you use it. (And even in regard to unspecified behavior [I'm thinking function declarations in blocks here], the committee has bent over backward to do their best to avoid imposing changes on it.)

this brings up the point that frontend developers have no say in these matters when a less-than-technical pm asks them to use frameworks that all but require these features. there are many of these managers in asia who copy whatever they perceive is trending in silicon valley, with very little care about the technical-risks they bring to projects.

# Isiah Meadows (7 years ago)

Inline.

On Fri, Jul 21, 2017 at 6:00 PM, kai zhu <kaizhu256 at gmail.com> wrote:

Can you produce any data at all to back that up? I've never seen any appetite in that regard at all.

no hard data admittedly. i regularly attend tech meetups in hong kong. at these gatherings, the general sentiment from frontend developers is that creating webapps has gotten considerably more difficult with the newish technologies. even the local presenters for react and angular2+ at these talks can’t hide their lack of enthusiasm for these frameworks (like they’re doing it mainly to hustle more side-business for themselves). on-the-job, we all generally try to avoid taking on such technical risks, until we are inevitably asked to by our manager. the ones i see who are enthusiastic are typically non-frontend-engineers who write mostly backend nodejs code, that isn’t all that more scalable or interesting with what you can do in java/c#/ruby.

Don't like const? Don't use it. Don't like arrow functions? Don't use them. Don't like async/await? (You guessed it.) Don't use them. Nothing other than unspecified behavior is any different unless you use it. (And even in regard to unspecified behavior [I'm thinking function declarations in blocks here], the committee has bent over backward to do their best to avoid imposing changes on it.)

this brings up the point that frontend developers have no say in these matters when a less-than-technical pm asks them to use frameworks that all but require these features. there are many of these managers in asia who copy whatever they perceive is trending in silicon valley, with very little care about the technical-risks they bring to projects.

Most of these flashy frameworks (e.g. Angular, React, and Aurelia) encourage use of non-standard and/or unstable language features, even though they shouldn't; it's a horrible idea to recommend anything that is not at least stage 3 for general framework usage. (Note: I'm an active core developer for Mithril, a front-end framework whose community generally rejects embracing the latest and greatest without a good pragmatic reason.) Decorators are stage 2, and the runtime API is still very much in flux. JSX is non-standard, and Facebook has stated they don't intend to promote it to become a standard in any way.

Conversely, a large number of TC39 people have to deal with large legacy projects and older browsers, and most of them work in large companies with very large, brittle code bases where introducing a new tool is often extremely costly, no matter what it is. Babel isn't an option for many of them, and even Rollup (ES modules) and Bublé (majority of useful ES6 language features) are difficult to introduce.

To draw a comparison, consider the structure of ESLint (100k+ lines of pure, unprocessed ES6 minus modules) vs. Babel (100k+ lines of Flow + JS with numerous transforms applied) vs. Angular (100k+ lines of TypeScript with some unstable features enabled). The first is built from pure JS and has had no issues with language stability; the second uses several non-standard additions and has been through multiple wide-reaching refactorings; and the third uses decorators in both its API and implementation, and that spec's API went into significant flux around the time Angular was ready to go stable. (And BTW, Babel itself pulled decorator support because of the feature's instability.)


# Bob Myers (7 years ago)

Let me weigh in as an Angular programmer.

Angular does not "encourage use of non-standard and/or unstable language features". It encourages (indeed, for all practical purposes mandates) the use of TypeScript, which, like it or not, is a perfectly well-defined language.

The TypeScript designers walk a fine line between providing features that the user base needs and wants, and getting too far out on the cutting edge. IMHO, they have done a very good job threading this needle. They religiously avoid adding features which are unstable or perceived as likely to change. The TS discussion list is replete with requests to add this, that, or the other stage-0 feature, but such requests are invariably rejected.

I unfortunately sometimes get the impression that the TC39 school views the TypeScript guys as the crazy uncle beating off over in the corner, while they define the "real" language at their own pace. Meanwhile, thousands of applications involving millions of lines of code are being developed in TS at high productivity levels, thanks not only to the language features but to the extraordinarily good tooling and ecosystem. It would be a huge mistake to think that TypeScript is merely the CoffeeScript of our time.

With regard to the questionable comments in the post to which Isiah was responding, no, platform and toolchain decisions are not made by ignorant, copy-cat Asians. Nor are they usually made by non-technical PMs or other people in suits, other than in the form of approving recommendations made by forward-looking engineering management at the software or product company which is outsourcing its development. If you are losing mind-share with those folks, it's not their problem, it's your problem. Our engineering managers have been through countless iterations of the platform wars over the years, and usually have a finely honed sense of risk/benefit. The risk/benefit equation for TypeScript is overwhelmingly positive.

My two cents.

Bob

# kdex (7 years ago)

Inline.

On Saturday, July 22, 2017 7:47:32 AM CEST Bob Myers wrote:

Let me weigh in as an Angular programmer.

Angular does not "encourage use of non-standard and/or unstable language features". It encourages (indeed, for all practical purposes mandates) the use of TypeScript, which, like it or not, is a perfectly well-defined language.

The TypeScript designers walk a fine line between providing features that the user base needs and wants, and getting too far out on the cutting edge. IMHO, they have done a very good job threading this needle. They religiously avoid adding features which are unstable or perceived as likely to change. The TS discussion list is replete with requests to add this, that, or the other stage-0 feature, but such requests are invariably rejected.

With regard to non-standard and/or unstable features:

Examples of cases where TypeScript has actively harmed the ECMAScript ecosystem would be every instance where it decided to mess with reserved words; enum and interface are just two such examples.

In fact, this has recently come up in the ecmascript-interfaces-proposal [1], which led to the (IMHO absurd) idea of introducing interface under the keyword protocol, just so that ECMAScript and TypeScript play nice together.

Compatibility is nice, but ECMAScript can not be holding the bag for every "compiles to ES" type of language out there, especially if those languages decide to gamble with syntax and semantics when it comes to reserved words. That's really their own fault.

CoffeeScript never made the mistake of claiming to be a superset of ECMAScript. TypeScript did. Consequently, there must either be incompatibilities between the two languages, or TypeScript will have to be versioned and made backwards-incompatible to follow ECMAScript. Both of which are equally horrendous and far from what I would still call "perfectly well-defined".

[1] michaelficarra/ecmascript-interfaces-proposal#3
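The reserved-word distinction at issue can be observed directly: `enum` is a future reserved word in every mode, while `interface` is reserved only in strict mode. A small probe (the helper name `parses` is just for illustration):

```javascript
// Returns whether the given source text parses as a function body.
function parses(src) {
  try { new Function(src); return true; } catch (e) { return false; }
}

console.log(parses("var enum = 1;"));                    // false: reserved in all modes
console.log(parses("var interface = 1;"));               // true: only reserved in strict mode
console.log(parses("'use strict'; var interface = 1;")); // false
```

So TypeScript's `interface` occupies an identifier that ECMAScript deliberately kept reserved for its own future use.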

# Andrea Giammarchi (7 years ago)

My 2 cents

TypeScript, which like it or not is a perfectly well-defined language.

TS these days is a broken, diverging, branch of latest ECMAScript features, IMO.

class List extends Array { }

console.log(new List instanceof List);
// false .. seriously

Try it yourself. [1]

You can also argue that type, enum, interface, and private are also not aligned with ECMAScript so as much as "well-defined language" as it is, it shouldn't compromise any future development of an already "well-defined language" too which is JavaScript.

Best

[1] www.typescriptlang.org/play/#src=class List extends Array { } console.log(new List instanceof List)%3B %2F%2F false

# T.J. Crowder (7 years ago)

On Sat, Jul 22, 2017 at 3:17 PM, Andrea Giammarchi <andrea.giammarchi at gmail.com> wrote:

class List extends Array { }

console.log(new List instanceof List);
// false .. seriously

Try it yourself. [1]

I don't have a horse in the TypeScript race, but that example is unfair: The playground targets ES5 by default. As you know, you can't correctly subclass Array with ES5 features. To get the expected instanceof result from that code in ES5-land, TypeScript would have to replace uses of instanceof with something of its own, sacrificing efficiency for a questionable gain (instanceof usually smells anyway). (It is unfortunate that www.typescriptlang.org/docs/handbook/classes.html doesn't mention this.)

But ES5 output is just an option; if you tell TypeScript to output ES2015 code instead (tsc --target ES2015 example.ts), you get the expected result. Note that Babel targeting ES5 output also outputs "false": goo.gl/aJuQjV (that's just a shortened version of the Babel link resulting from pasting the code above into babeljs.io/repl)
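A simplified sketch (not TypeScript's or Babel's literal output) of why ES5-style down-leveling breaks the prototype chain when subclassing Array, compared with native class syntax:

```javascript
// ES5 emulation: calling Array as a function ignores `this` and returns a
// brand-new plain array, and returning an object from a constructor
// overrides `this` -- so the List prototype chain is lost.
function List() {
  return Array.apply(this, arguments);
}
List.prototype = Object.create(Array.prototype);

var legacy = new List(1, 2, 3);
console.log(legacy instanceof List); // false: its prototype is Array.prototype

// Native ES2015 class syntax subclasses Array correctly:
class List2 extends Array {}
console.log(new List2(1, 2, 3) instanceof List2); // true
```

This is also why methods added to the emulated List.prototype never show up on its instances, as mentioned later in the thread.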

If necessary, let's have a reasonable discussion of whether TypeScript's co-opting of keywords and syntax has a negative effect on the evolution of JavaScript (if that's a useful conversation to have), but let's not blame TypeScript for ES5's deficiencies, particularly not ones that were fixed more than two years ago.

-- T.J. Crowder

# Andrea Giammarchi (7 years ago)

My point was that ES classes work as spec'd; TypeScript classes are broken without warning.

It's also not just a matter of smelly instanceof: the problem is that if you add a method to that List class, it won't be there. The returned instance/object has nothing to do with the List class, so classes in the well-defined TypeScript language are not even close to native JS classes.

The Babel bug is well known; I've already proposed a fix, and I also opened the currently open issue about it.

Babel, though, doesn't claim to be a well-defined language; it's just a transpiler, hence my nitpick on the TS description.

About frameworks and Angular: version 1 was based on eval, and I think frameworks should have an OK from TC39 before being considered influential.

TBH, I'd love to have an official review channel for hacky practices used in frameworks, no matter how big their company is, with an official "TC39 friendly" flag.

No idea if there are resources to do that, though, so I'm just speaking out loudly.

# Mike Samuel (7 years ago)

On Jul 22, 2017 11:14 AM, "Andrea Giammarchi" <andrea.giammarchi at gmail.com> wrote:

About frameworks and Angular: version 1 was based on eval, and I think frameworks should have an OK from TC39 before being considered influential.

What problems would this address? I would prefer almost any other solution to taking "official" positions on frameworks or libraries.

# Maggie Pint (7 years ago)

Having been a delegate to tc39 from the JS foundation for only about six months, I can't claim authority to speak to all of the history here. That said, I can very assuredly tell you there is a significant amount of respect for every technology that has been mentioned in this thread from most of the committee. In general, the committee sees any tool with significant adoption as an opportunity to learn/draw ideas from, not a plague.

Certainly, all delegates have their own opinions on various things. That said, IMO you wouldn't see any interest in policing libraries and frameworks from the committee. This is in conflict with the extensible web manifesto that most of the committee holds quite dear.

# kai zhu (7 years ago)

On Jul 22, 2017, at 11:37 PM, Maggie Pint <maggiepint at gmail.com> wrote:

Having been a delegate to tc39 from the JS foundation for only about six months, I can't claim authority to speak to all of the history here. That said, I can very assuredly tell you there is a significant amount of respect for every technology that has been mentioned in this thread from most of the committee. In general, the committee sees any tool with significant adoption as an opportunity to learn/draw ideas from, not a plague.

tc39 should be a bit more assholish imo. frontend-development (at least in asia) is in a zombie-state with few really understanding the technical-risks of the latest technologies and practices. this mess is partly due to no-one in the committee having the foresight and courage to gatekeep es6 to a manageable number of features.

# Andrea Giammarchi (7 years ago)

answering to all questions here:

What problems would this address?

It will give developers a clear indication of what's good and future proof and what's not so cool.

MooTools and Prototype extending natives in all ways didn't translate into "cool, people like these methods, let's put them on specs" ... we all know the story.

Having bad practices promoted as "cool stuff" is not a great way to move the web forward, which AFAIK is part of the manifesto too.

In general, the committee sees any tool with significant adoption as an opportunity to learn/draw ideas from, not a plague.

That's the ideal situation, reality is that there are so many Stage 0 proposals instantly adopted by many that have been discarded by TC39.

This extends to other standards like the W3C or WHATWG; see Custom Elements built-in extends as a clear example of what I mean.

The committee might have the right opinion even about proposed standards where experimenting developers do not, so as much as I believe what you stated is true, I'm not sure that's actually what happens. There are more things to consider than hype, and thank gosh it's like that.

you wouldn't see any interest in policing libraries and frameworks from the committee

agreed, because policing is a strong term. "TC39 friendly" is what I was thinking of, something like any other GitHub badge when it comes to code coverage, coding style, or target engines.

I'm a pioneer of all sorts of hacks myself, but I know that if my new library is fully implemented thanks to eval and global prototype pollution, through transpiled code that uses reserved words, probably nobody besides me inside my crazy-lab should use my own code, no matter how much I promote it.

This is in conflict with the extensible web manifesto

The situation I've just described would be indeed against the web manifesto, wouldn't it?

tc39 should be a bit more assholish imo.

No, it shouldn't; it should be open-minded and able to listen too. However, when TC39 makes a decision, the JS community follows that decision quite religiously.

If TC39 says everything is fine, you have the situation you describe today.

If TC39 gave a little extra direction, you'd have people thinking about what they're using daily. For example:

statement: TC39 considers Stage 1 unstable; it should never be used in production.
result: people using early transpilers cannot complain about anything, and it's their choice.

statement: TC39 considers the usage of eval inappropriate for production.
result: people using any library fully based on eval or Function would start looking for better options.

And so on. I hope my previous email is now a bit clearer; I'm a JS developer myself and I've been promoting both polyfills and libraries/frameworks/utilities since forever.

If anyone from TC39 told me "dude, this is bad because it's not future friendly", I'd put that info in the README of the GitHub repo and tell people about it, even if it's my own lib.

Best

# doodad-js Admin (7 years ago)

“TC39 consider the usage of eval inappropriate for production”

And what about dynamic code, expression evaluation, ...? Who woke up one day and decided that nobody should use “eval”?


# Andrea Giammarchi (7 years ago)

CSP to name one, but you picked 1% of my reply.

# Naveen Chawla (7 years ago)

Typescript allows breaking changes, ES doesn't.

Hence it would be an acceptable decision for ES to clash with an existing Typescript keyword and force Typescript to update accordingly.

Typescript developers shouldn't be unprepared, and ES can continue on its path.

None of this makes Typescript "bad". Developers can keep using their existing version of Typescript and its transpiler if they don't want to risk disruption.

So this kind of works for everybody:

  • those who want bleeding-edge ideas implemented and are prepared to update in the face of breaking changes can use e.g. Typescript and keep updating its version;
  • those who want current bleeding-edge ideas implemented without risking breaking changes can use e.g. Typescript but stick to the same version;
  • those who want to use the latest features of ES can do so directly;
  • those who want old ES code to continue to work can have that.

So it seems all of these cases are serviced OK.

I'm not sure it's TC39's job to mark implementations of preliminary ideas as "unfriendly". If anything, such implementations could expose any weaknesses of those ideas so that they can be improved upon or, failing that, validated as-is, potentially more clearly than a hypothetical discussion would, and that would carry value in itself.

So Javascript and Typescript serve different purposes. Typescript, being transpiled to Javascript, has the luxury of not having to be backwards compatible, whereas Javascript, because it runs directly in browsers, has to be.

# Steve Fink (7 years ago)

On 07/21/2017 03:00 PM, kai zhu wrote:

Can you produce any data at all to back that up? I've never seen any appetite in that regard at all.

no hard data admittedly. i regularly attend tech meetups in hong kong. at these gatherings, the general sentiment from frontend developers is that creating webapps has gotten considerably more difficult with the newish technologies. even the local presenters for react and angular2+ at these talks can’t hide their lack of enthusiasm for these frameworks (like they’re doing it mainly to hustle more side-business for themselves). on-the-job, we all generally try to avoid taking on such technical risks, until we are inevitably asked to by our manager. the ones i see who are enthusiastic are typically non-frontend-engineers who write mostly backend nodejs code, that isn’t all that more scalable or interesting with what you can do in java/c#/ruby.

I think this is mixing up frameworks with the language. There is indeed extreme framework fatigue, and has been for quite some time. Language changes have been much slower and less mind-bending, and it is my (uninformed) impression that people generally haven't had too much difficulty incorporating them. Or at least, not nearly as much as learning the mindset of e.g. React or Flux or Angular or whatever. And there seems to usually be a sense of relief when something gets added to the language that removes the need for the workarounds the libraries and frameworks have been using for some time.

# Steve Fink (7 years ago)

This makes sense to me. Though I kind of feel like the discussion has veered off on a less useful direction because of reactions to words like "policing" or "gatekeeping". It may be more productive to consider whether it might be useful to have a mechanism whereby frameworks could leverage the expertise of people close to tc39. If I were a framework author (and I'm not), I would appreciate having the ability to say "hey, I'm thinking of doing X. What current or potential problems could X run into with respect to ES?" The expectation is that I would take the feedback into account (so tc39 people wouldn't feel like they were shouting into the void, or participating in a meaningless feel-good opportunity.) TC39 would benefit by having some degree of influence (not control!) over the more unfortunate directions of frameworks, as well as getting more exposure to the sorts of problems people are running into.

Anyway, I don't have a dog in any of these races. (Hell, I'm more of a cat person to begin with.) I just see the conversation taking a less than useful path, and wanted to point it out.

# Bob Myers (7 years ago)

Some comments on various posts in this thread:

  1. Asia has more than four billion people. Can we please avoid making generalizations about the level of competence of engineering managers in that region to make risk/benefit trade-offs?

  2. I don't understand the TC39 process, but I am guessing they are not authorized to make pronouncements about framework friendliness. Given the overall level of bureaucracy, it would probably take a year, if not longer, even to amend the charter to include the ability to issue such moral judgments in their mission. Then each individual judgment would take months at best to be approved. Then such judgments, once issued sometime in 2019, would be widely ignored by the community.

  3. If someone is worried about the rogue designers on the TypeScript team hijacking the enum keyword, why not just adopt it as is? The design is fine as it stands.

  4. I cannot but read some of the comments in this thread as amounting to saying that no-one should be allowed to innovate outside the TC39 framework, and no-one should be allowed to adopt such innovations. As an unabashed TypeScript fan, this puzzles me. At our company, we are not only using TypeScript for all front-end work, but also for node.js back-ends. The decision to do so was made by the consensus of a number of knowledgeable managers who are fully cognizant of the risk/reward equation. There is an undeniable, demonstrable improvement in programming productivity and code quality. While TC39 was bike-shedding about exponentiation precedence, or adding padLeft, the TypeScript team was making real, meaningful, useful, real-world improvements.

  5. I think there is a tendency to underestimate the level of engineering excellence embodied in TypeScript, and their self-understanding of where they fit in the JS ecosystem. Personally, I have a great deal of confidence in their ability to adapt to and maintain compatibility with evolving TC39 standards. As Maggie wisely points out, we should learn from them, not piss on them.

Bob

# Jordan Harband (7 years ago)

Please note, "piss on them" is certainly achieved by your antagonistic comments about TC39's "bikeshedding" on two valuable recent proposals, which do solve real-world problems - even if they aren't your own.

I'll withhold comment on the rest of the thread beyond a general comment to everyone (everyone, to be clear, not just the person my previous comment is in reply to): please behave in a professional manner; please refrain from making any kind of sexual references; please do not generalize about any groups of people, be they TypeScript users, TC39 members, people who live in Asia, etc. My personal suspicion is that people who are unable to have self-control in these matters will find themselves losing the ability to comment further on es-discuss, but either way, let's be courteous.

Thanks!

# Vinnymac (7 years ago)

Jordan is right about this needing to be a more civil discussion. The hostile environment prevents people from even wanting to participate in the conversation, including myself.

Above Steve mentions that many people are mixing language additions with framework fatigue. I have to agree with him. In my case I am not overwhelmed by any of the additions TC39 has chosen to make to ECMA. In fact it is something I look forward to each year now that things seem to be iterating at a faster rate. It feels more mature, and we can already do so much more than we ever could just a couple of years ago.

I feel the language shouldn't remove a feature unless it is absolutely necessary. New and abundant proposals are a good thing. It means we have a lot of requests to make the language fit all of the use cases and needs that exist. TC39 has a pretty great proposal process [1], and I imagine that in and of itself will help protect against proposals unnecessarily entering the language.

I don't think TC39 should need to babysit or have a special line of communication with any specific libraries. Smells of lobbying and politics.

[1] tc39.github.io/process-document

# T.J. Crowder (7 years ago)

Massive +1 on Jordan's call for increased civility and cleaner language.

On Sun, Jul 23, 2017 at 3:58 AM, Vinnymac <vinnymac at gmail.com> wrote:

Above Steve mentions that many people are mixing language additions with framework fatigue. I have to agree with him. In my case I am not overwhelmed by any of the additions TC39 has chosen to make to ECMA.

Absolutely. Framework fatigue? Yes. Language change fatigue? Not at all. The opposite, if anything -- eagerness to get to using the new stuff.

I don't think TC39 should need to babysit or have a special line of communication with any specific libraries. Smells of lobbying and politics.

Yes indeed. Similarly:

On Sat, Jul 22, 2017 at 4:14 PM, Andrea Giammarchi <andrea.giammarchi at gmail.com> wrote:

TBH, I'd love to have an official review channel for hacky practices used in frameworks, no matter how big their company is, and to flag them officially as TC39-friendly.

Same problem (lobbying and politics), as well as scope creep (that's not their job) and limited resources (I'd rather they spent their TC39 time on the main mission: moving the language forward).

On Sat, Jul 22, 2017 at 6:44 PM, Andrea Giammarchi <andrea.giammarchi at gmail.com> wrote (replying to kai zhu <kaizhu256 at gmail.com>):

tc39 should be a bit more [expletive deleted] imo.

No it shouldn't, it should be open minded and able to listen too.

Amen.

However, when TC39 makes a decision the JS community follows quite religiously that decision.

If TC39 says everything is fine, you have today situation you describe.

I haven't seen any indication from TC39 that using Stage 1 proposals in production is "fine."

If TC39 would give some little extra direction, you'd have people thinking about what they're using daily, example:

statement: TC39 considers Stage 1 unstable and it should never be used on production.

result: people using early transpilers cannot complain about anything about it and it's their choice.

statement: TC39 considers the usage of eval inappropriate for production

result: people using any library fully based on eval or Function would start looking for better options

I wouldn't have thought such statements were required, given the way the stages are described in the process document. Reading through that, if you're relying on anything that isn't at least at Stage 4 or well into Stage 3, you should know that it could change significantly or go away entirely, and be prepared for that. Caveat usor.

But sure, perhaps a "guidelines for use in production" section would be useful. The people who are relying on Stage 1 stuff presumably haven't read the process document anyway and so won't see the guidelines, but having them there makes it easier for people to point out to them the dangers (or at least, considerations) of what they're doing, backed by a link to the document. E.g., that if they want to use X, that's their choice, but they should be aware they'll probably have to refactor at some point, perhaps repeatedly, as it evolves into something stable.

-- T.J. Crowder

# Darien Valentine (7 years ago)

But sure, perhaps a "guidelines for use in production" section would be useful. [...] having them there makes it easier for people to point out to them the dangers (or at least, considerations) of what they're doing, backed by a link to the document [...]

In my own experience, that might have been a useful thing to have a few times. A few years back I didn’t have the proper context for making decisions about what the consequences of using stage 0-2 features might be over time. I came to regret that. Later I had a better understanding of this and had come to feel pretty strongly that it was unwise to use anything less than stage 3 for projects that are expected to have a long future. However it’s not always easy to convince people of this, especially if they themselves can point to a lot of stuff that actually advises using pre-stage 3 proposed features and non-standard extensions.

Being able to point to a formal statement about what the stages mean not just from the point of view of TC39’s internal process, but also what they imply for a consumer standpoint — chance of ultimate inclusion, overall stability — would be helpful.

Would it actually lead to more careful decision making? I’m not sure, but consider the mental health benefits: having linked to such an official doc during a discussion of these concerns would mark a point after which one might say to oneself: "okay... well, one did try". Then, rather than continue such a discussion endlessly, one may instead grant oneself a few moments to stare out a window wistfully, accept fate, sigh, and think about ponies.

# David White (7 years ago)

Lots of good thoughts and discussions here, and while it’s gone slightly off topic I’d love to discuss the possibilities of how we could get JavaScript to a point where we could actively remove features with every new specification.

I’m sure nobody would want to break the web, which removing any part of JavaScript would very likely do, and that is certainly the biggest challenge. Still, it seems a shame that we can’t find an alternative direction, as allowing features we consider bad practice today to persist, along with the overhead that comes with them, hinders progress more than it helps.

Linting is certainly the fastest and easiest method, but it’s not really a solution, in that we only lint our own code, not the additional code we rely upon. Ideally, removal of features should mean more performance out of JavaScript: if engines have fewer constructs to deal with, there should be some performance benefit.

Given the lack of control over what browsers many users are using perhaps versioning could be a new semantic built into the language itself in the same way we have strict mode?

We could allow developers the option to specify the version they wish to use, avoiding unnecessary transpilation back to ES5 for applications confident enough to give their users the choice to upgrade if needed, while also allowing browsers to run code based on versions?

I'm sure it’s worth considering, as removing features from a language or application is as important as, if not more important than, adding them.

# kai zhu (7 years ago)

On Jul 23, 2017, at 10:58 AM, Vinnymac <vinnymac at gmail.com> wrote:

Above Steve mentions that many people are mixing language additions with framework fatigue. I have to agree with him. In my case I am not overwhelmed by any of the additions TC39 has chosen to make to ECMA. In fact it is something I look forward to each year now that things seem to be iterating at a faster rate.

-1 strongly disagree. the explosion of different frameworks is encouraged by the current unstable nature of ecmascript. the phenomenon wouldn't have been so severe if there wasn’t the mindset that ecmascript is undergoing a "language revolution", and everyone had to write their own framework to adapt to it.

It feels more mature, and we can already do so much more than we ever could just a couple of years ago.

-1 in the end-goal of browser UX capabilities, i feel the latest batch of frameworks don’t add anything more capable than the older simpler ones. they simply employ more complicated procedures to achieve the final desired UX feature.

i feel the commercial web-industry is now more wanting on guidance to reliably ship and maintain products. the views of some people that ecmascript should further expand and develop new ideas, hardly helps in this regard.

# doodad-js Admin (7 years ago)

Maybe it's time to start a new major version of JS?

# doodad-js Admin (7 years ago)

To be honest, I started my own framework because of the lack of classical OOP and a clear type system in JS. I know TypeScript, but that’s another language, not just a framework like mine. After that, ES6 classes came along, but they do not meet my needs, and some choices in their behavior are not convenient. I’m waiting for new proposals like public/private fields and decorators, but just for the purpose of transpiling my classes to ES6 classes as much as possible, for performance. I don’t think I have enough influence to get my class system reflected in JS :)


# David White (7 years ago)

That’s an interesting proposal, but I’m struggling to see how it wouldn’t face this same issue 2, 3, or 5 years down the line. JavaScript and browsers evolve in step with HTML, CSS, and the rest of the platform, and given the disparity of devices today alone, let alone tomorrow, a new major version would cause more harm than good. Ideally we would need a solution that treats ES5 as the standard baseline, and then builds the semantic into later versions of the language.

Perhaps this is not a problem for this community, but something browser vendors should be considering: allowing developers to specify their runtime, in order to let the language progress more naturally?

# David White (7 years ago)

Ooooh, a mime type based versioning would be very nice!

<script src="/myapp.js" type="application/javascript.2"></script>

For the most part you control your application's language decision and honour that with your bundle, then load additional scripts you have little control over, such as logging, monitoring, stats, etc., in different runtimes perhaps.

It’s certainly cleaner than injecting a version into the top of your bundle and would allow 3rd party vendors to provide version specific bundles.

# doodad-js Admin (7 years ago)

And more, engines could signal they support JS 2 just by the Accept header!
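A minimal sketch of how such negotiation might work server-side, assuming the hypothetical "application/javascript.2" MIME type floated in this thread (the bundle file names are invented for illustration):

```javascript
// Hypothetical content negotiation for the "application/javascript.2"
// MIME type discussed above. Neither the type nor the bundle names
// are standardized; this is only an illustration.
function pickBundle(acceptHeader) {
  const types = String(acceptHeader || "")
    .split(",")
    .map((entry) => entry.split(";")[0].trim());
  // Serve the JS 2 bundle only to clients that advertise support.
  return types.includes("application/javascript.2")
    ? "myapp.v2.js"
    : "myapp.v1.js";
}
```

Older browsers that never send the new type would transparently keep getting the version-1 bundle.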

# doodad-js Admin (7 years ago)

I'm dealing with 4 emails, so sometimes I don't select the good one and get rejected by es-discuss :) This is my latest email...

Claude Petit

# Owen (7 years ago)

It may be worth looking at Python 3 as a cautionary tale. For example, here is a mailing list post from 2009 debating whether it is worth adding features to 2.7 given that 3 is on the way: mail.python.org/pipermail/python-dev/2009-October/093204.html . Eight years later, Python 2.7 shows no signs of going away, and the language has become, at least for portable code, basically static since then.

Of course, although the Python developers used an abundance of caution in rolling out a major version, it remains an open question whether adoption would have been smoother had the C API not changed. It may be that this difficulty would not affect JavaScript, and a new major version would be more feasible.

# kdex (7 years ago)

This was kind of the original idea around Dart IIRC. It didn't see much traction though, and eventually got called off.

# Isiah Meadows (7 years ago)

Thought I'd clarify why I brought TypeScript into this:

  • TypeScript in and of itself is not a problem.
  • Angular's use and embrace of TypeScript in and of itself is not a problem.
  • TypeScript was more of a tangentially related side detail.

I was only referring to Angular embracing a stage 2 proposal, specifically decorators (with TypeScript extensions thereof), that had become rather unstable a few months before they went beta with v2.0. It's a technical risk taken by a framework with already very high usage, one that introduced significant technical debt, and one I questioned from the start.

(I could understand if it were stage 3, gaining browser support. But to be clear, a year ago when they made the decision, it was a poor choice, being relatively new and immature for a stage 2 proposal.)



# Naveen Chawla (7 years ago)

Don't break the web.

Those who don't like old features just don't need to use them.

Applications that do use them would break!

If reducing the available feature set would increase performance then a declaration to the browser to ignore deprecated features should be enough to get this optimization, but only for this reason.

But do we even know that there is any performance gain for doing so?

I'm not bothered about browser code complexity to support old features - I don't see that code anyway. I don't know to what extent other javascript developers are bothered about it either.

# Sebastian Zartner (7 years ago)

On 24 July 2017 at 10:47, Naveen Chawla <naveen.chwl at gmail.com> wrote:

If reducing the available feature set would increase performance then a declaration to the browser to ignore deprecated features should be enough to get this optimization, but only for this reason.

But do we even know that there is any performance gain for doing so?

I'm not bothered about browser code complexity to support old features - I don't see that code anyway. I don't know to what extent other javascript developers are bothered about it either.

Well, JavaScript developers may not worry about the complexity of browser code, but browser developers, or more generally speaking, JavaScript engine developers, should care about the complexity of their code. Supporting old features is not only a question of engine performance but also of memory consumption and the (download) sizes of their programs. So it should be in their interest to deprecate misfeatures, even if it takes 5, 10, or 15 years until usage drops enough that they can actually be removed.

Sebastian

# Alexander Craggs (7 years ago)

Some thoughts on using MIME types to distinguish between versions:

1. In terms of including it in <script> tags, I don't think that is a good idea for including third-party libraries in your code. It would be hard to tell what version of JavaScript a library is using; I think it would be better to let the version be defined by the library, instead of the user.

2. In response to "server can set MIME type", a lot of libraries are hosted on systems that don't allow you to change headers, for example Github pages.  Plus, there are several JavaScript CDNs out there that also don't let you set MIME type.

3. Not many newer individuals know how to set headers, I don't think I'd be able to set headers on Apache off the top of my head.

I may be biased, having suggested a separate method of versioning with a "use es8" declaration at the top of a file. As a side note, since I made the other thread that seems to be discussing almost exactly the same topic as this one (versioning to add breaking changes vs. versioning to remove language features), do people think it would be a good idea to discuss both solely in this thread (the more popular one)?
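The backwards-compatible trick behind a "use es8"-style opt-in can be sketched as follows. Note the directive name is invented here; nothing standardizes it:

```javascript
// Sketch of the hypothetical "use es8" opt-in mentioned above.
// Current engines parse an unknown directive as a plain string
// expression statement and ignore it, so old engines still run the
// code unchanged, which is what makes directive-based versioning
// viable without a new MIME type or file extension.
function modern() {
  "use es8"; // unknown directive: a harmless no-op in today's engines
  return [1, 2, 3].includes(2);
}
```

An engine that did recognize the directive could switch whole feature sets on or off for the enclosing scope, the way "use strict" already does.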

# doodad-js Admin (7 years ago)
    1. In terms of including it in <script> tags, I don't think that is a good idea for including third party libraries in your code. It would be hard to tell what version of JavaScript a library is using, I think it would be better to allow it to be defined by the library, instead of the user.

That’s why I’ve suggested a new file extension (js2).

    2. In response to "server can set MIME type", a lot of libraries are hosted on systems that don't allow you to change headers, for example Github pages. Plus, there are several JavaScript CDNs out there that also don't let you set MIME type.

I’m talking about the “Accept” header, which is set by the client on each request. Browsers that support JS 2 will automatically append “application/javascript.2” to that header.

    3. Not many newer individuals know how to set headers, I don't think I'd be able to set headers on Apache off the top of my head.

That’s basic web development, but it’s not required in this situation.


# Alexander Craggs (7 years ago)

I'm sorry, I missed that suggestion.

That definitely sounds significantly better than a new MIME type.  Although two thoughts I have are:

- How are you going to deal with scenarios that don't have extensions, e.g. REPL or inline JS.

- Are extensions going to be released often, or is this going to be a one-time thing? For example, would we increment the version number with the current JS version (js6, js7, etc.), and if so, would it make more sense to start at js7 instead of js2?

# doodad-js Admin (7 years ago)

My apologies, you should still set the “Content-Type” header from the server as usual, if that’s not done automatically.

# doodad-js Admin (7 years ago)
  • How are you going to deal with scenarios that don't have extensions, e.g. REPL or inline JS.

For inline, that’ll be:

<script type="application/javascript.2">...</script>

For REPL, I don’t know... I didn’t think about that one :-) It should be based on the content of the page. And I don’t know if we should allow mixing different versions together. Those are things we’ll have to clarify.

  • Are extensions going to be released often, or is this going to be a one time thing?

Only at another major revision of JS, which shouldn’t happen again for a long time.

  • would it make more sense to start on js7 instead of js2

No, because ES6, ES7, ... are still JS 1.


# Alexander Craggs (7 years ago)

I think version interoperability is a must in a world of Webpack & Browserify.

# Brendan Eich (7 years ago)

This thread makes me want to unsubscribe from es-discuss. I think I recreated the list. :-(

Please read esdiscuss.org/topic/no-more-modes and esdiscuss.org/topic/use-es6-any-plans-for-such-a-mode#content-2.

"Don't break the web" is not some vague high-minded notion of TC39's. It's a consequence of hard-to-change browser-market game theory. No browser wants to risk (however small the risk) breaking what might be more of the web than one thinks at first. It's very hard to find out what "the web" is and prove absence of breakage (paywalls, firewalls, archives, intranets, etc.). There's very little gain and potentially much pain, which could mean support calls and market share loss.

This is not just a browser market failure. Developers don't want their code broken, until they stop using something and then ask for it to be removed. That is not globally coordinated, so it won't fly, as browser market share depends in part on developer testing and use of browsers. Ecosystem effects militate against breaking the web in deep ways, in general.

Yet ECMA-262 has broken compatibility in a few edge cases. And browser competition led to some dead limbs and underspecified pain-points (e.g., global object prototype chain).

And Google people seem to be leading the Web Intervention Community Group, which wants to break the web a bit (rather than block 3rd party ad/tracking scripts :-P). So perhaps we can break some DOM APIs such as sync touch events, without also breaking gmail :-). The jury is still out in my view, but Chrome has enough market power to push and assume more risk than other browsers.

Core language changes are different in kind from sync touch events. It's very hard to plan to remove anything on a practical schedule or order-of-work basis. Engine maintainers likely still hate more modes, and users should too. New syntax as its own opt-in still wins, although this obligates TC39 to work on future-proofing, e.g., : after declarator name in declaration for type annotation syntax.

So based on 22+ years doing JS, I believe anything like opt-in versioning for ES4, a la Python3 or Perl6, is a non-starter. Period, end of story.

Ok, I'm not unsubscribing -- but I hope more people read and search esdiscuss.org and engage with history instead of coming as if to a blank slate. Santayana's dictum applies.

# Bruno Jouhier (7 years ago)

Reading this thread, it feels that cleaning the language raises more problems than it solves. It is not even clear how versions should be flagged.

TC39 has worked very hard to keep the language backward compatible and avoid "breaking the web". So only a very strong motive would justify removal of features. Security is one and that explains why arguments.callee is going away, but I don't see others. Even performance isn't a strong enough motive: libraries that don't perform because of inefficient language features (with, eval) will just die or be replaced. No need to be proactive here, just let the Darwinian process take its course.

Language cleanup is the business of linters. They let you enforce modern features within your teams/projects, without impacting others nor existing libraries.

Derived languages and transpilers (TypeScript) are the perfect place to experiment with new features. This is much better than taking chances with JavaScript itself.

If it ain't broke, don't fix it.

Bruno

# doodad-js Admin (7 years ago)

We are just talking about how we can enhance JS without “breaking the web”. And the solution we are talking about is to make a major revision of JS (JS version 2), instead of breaking the current one (JS version 1).


On Tue, Jul 25, 2017 at 2:10 PM Alexander Craggs <alexander at debenclipper.com> wrote:

I think version interoperability is a must in a world of Webpack & Browserify.

On 25/07/2017 21:12:58, doodad-js Admin <doodadjs at gmail.com> wrote:

  • How are you going to deal with scenarios that don't have extensions, e.g. REPL or inline JS?

For inline, that’ll be:

<script type="application/javascript.2">...</script>

For REPL, I don’t know... I didn’t think about that one :-) It should be based on the content of the page. And I don’t know if we should allow mixing different versions together. Those are things we’ll have to clarify.

  • Are extensions going to be released often, or is this going to be a one-time thing?

Only at another major revision of JS, which should not happen for a long time.

  • Would it make more sense to start on js7 instead of js2?

No, because ES6, ES7, ... are still JS 1.

On Tuesday, July 25, 2017 3:54 PM, Alexander Craggs <alexander at debenclipper.com> wrote:

I'm sorry, I missed that suggestion.

That definitely sounds significantly better than a new MIME type. Although two thoughts I have are:

  • How are you going to deal with scenarios that don't have extensions, e.g. REPL or inline JS.

  • Are extensions going to be released often, or is this going to be a one time thing? For example, would we increment the version number with the current JS version (js6, js7 etc) and if so, would it make more sense to start on js7 instead of js2?

# doodad-js Admin (7 years ago)

With something like Babel, we should be able to transpile from JS 2 to JS 1 for backward compatibility. We are already doing it from ES6/ES7 to ES5.


# Mark (7 years ago)

I hope more people read and search esdiscuss.org and engage with history instead of coming as if to a blank slate.

In all fairness, es-discuss is rather ancient in the way it works. I personally would recommend that es-discuss come up with a better way to keep track of its threads; the current setup is rather confusing, imo. FWIW, I would recommend Discourse (www.discourse.org). I agree with you on the same topics coming up constantly, though.

# Andreas Rossberg (7 years ago)

As for the recurring assumption that deprecation would help simplify JavaScript implementations: no, not to a relevant degree. 80+% of the complexity in a JS VM comes from the plethora of (sometimes ridiculous) edge cases in the core semantics of JavaScript, its object model, implicit conversions, etc., and the desire to make all that fast in the common case without breaking correctness of the million special cases. None of that can be deprecated without creating a completely new language.

And clearly, modes or versions only make things worse in that regard. Strict mode already is a pig when it comes to implementation complexity (in retrospect, it does not carry its weight IMHO). ES6 made it worse. Our experiments with strong mode a while ago increased complexity even further, so much that the urge to rip it out again overtook very quickly. I for one am eternally healed of modes.

# Michał Wadas (7 years ago)

Simple idea:

  • Add a new Annex to the language.
  • Define an operation EmitDeprecationWarning(code) - implementations MAY show a deprecation warning in an implementation-dependent way (it can depend on a runtime flag, dev tools, non-minified code, etc.); otherwise EmitDeprecationWarning is a no-op.
  • Define when implementations SHOULD emit deprecation warnings - e.g. on use of the with statement, non-standard RegExp properties, the compile method, assignment to arguments, getYear, etc.
  • Language features can be removed after 10 (15?) years.

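A minimal sketch of what such a hook could look like, under stated assumptions: the registry contents, the runtime flag, and the exact shape of EmitDeprecationWarning are all illustrative, not spec text.

```javascript
// Hypothetical sketch of the proposed EmitDeprecationWarning(code) operation.
// The registry and the warningsEnabled flag are assumptions for illustration.
const deprecations = new Map([
  ["with", "the with statement is deprecated"],
  ["getYear", "Date.prototype.getYear is deprecated; use getFullYear"],
  ["RegExp.$1", "static RegExp match properties are deprecated"],
]);

let warningsEnabled = true; // implementations MAY tie this to a flag/devtools

function EmitDeprecationWarning(code) {
  if (!warningsEnabled) return; // otherwise the operation is a no-op
  const msg = deprecations.get(code);
  if (msg !== undefined) console.warn(`Deprecation: ${msg}`);
}

// An engine's Date.prototype.getYear implementation would then start with:
//   EmitDeprecationWarning("getYear");
```

The MAY/SHOULD split in the proposal maps naturally onto the flag: an engine is always allowed to make the operation a no-op, so no program behavior can depend on the warning being shown.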
# Brendan Eich (7 years ago)

is this the best link to the Strong Mode post-mortem?

groups.google.com/d/msg/strengthen-js/ojj3TDxbHpQ/5ENNAiUzEgAJ

# Andreas Rossberg (7 years ago)

On 26 July 2017 at 17:38, Brendan Eich <brendan.eich at gmail.com> wrote:

Hi Andreas, is this the best link to the Strong Mode post-mortem?

groups.google.com/d/msg/strengthen-js/ojj3TDxbHpQ/5ENNAiUzEgAJ

Yup.

# Brendan Eich (7 years ago)

On Wed, Jul 26, 2017 at 4:44 AM Michał Wadas <michalwadas at gmail.com> wrote:

Simple idea:

  • Add a new Annex to the language.
  • Define an operation EmitDeprecationWarning(code) - implementations MAY show a deprecation warning in an implementation-dependent way (it can depend on a runtime flag, dev tools, non-minified code, etc.); otherwise EmitDeprecationWarning is a no-op.

Who sees the warnings? Publishers hire contractors to build (or re-build) a site in year N; in year N+M, when the contractors are long gone, users visit the site and get unseen warnings in their unopened devtools. What upper bound can be put on M?

  • Define when implementations SHOULD emit deprecation warnings - e.g. on use of the with statement, non-standard RegExp properties, the compile method, assignment to arguments, getYear, etc.

Who sees the warnings? Not the contractors, they are long gone by year N+1.

  • Language features can be removed after 10 (15?) years

So M=10 might work (who knows?) but it's so long a time frame that no one will act on the remote threat of breakage. And yet at Y=N+M, there will be sites (not just web.archive.org) using the old feature, I would bet real money. We know this from looking back at when the Web was smaller and easier to coordinate.

Your model seems to assume a small-world-network coordination system. That's not the Web.

I created JS in 1995. In 1996 I made a few incompatible changes to JS and got away with it, but not in 1997. ES3 was done in 1999 based on de-facto work in Netscape and IE that converged (mostly; a few edge cases) around the same time, but even by 1998 the only way to coordinate was via the ECMA-262 standards work, not just ES1 but the discussions about future work we were having in 1997.

This kind of TC39 coordination helps for sure, don't get me wrong. But it does not solve the publisher/contractor division of labor leaving M effectively unbounded.

For a language like Java or C# used server side, where the retrograde sites can stick to old tool/runtime versions as long as vendors support them, M can be a "Goldilocks" interval, not too big, not too small. The threat of vendors obsoleting old versions pushes most customers to upgrade in time, and the customers of size can push back and keep support going an extra year or three if need be.

But that's not the Web. On the web, you don't just have the publishers and contractors, you have browser users also not coordinated except possibly by loose rules about supported browsers (banks try this and still get it wrong). Most sites do not want to turn away users based on detailed user agent version checks.

Suppose TC39 said "with" was going away in 2027. Who among content owners, developers they hire sporadically, or browser users visiting their sites would do anything, and why would they do it? If a browser in 2027 ships without "with" support ahead of other major browsers, what happens to its support costs and market share?

I hope this helps. It's very hard to remove things on the Web. That's the nature of the beast.

# Mike Samuel (7 years ago)

On Wed, Jul 26, 2017 at 5:55 AM, Andreas Rossberg <rossberg at google.com> wrote:

And clearly, modes or versions only make things worse in that regard. Strict mode already is a pig when it comes to implementation complexity (in retrospect, it does not carry its weight IMHO). ES6 made it worse. Our

IIRC, the primary argument for strict mode wasn't implementation simplicity, but the ability to do sound static analysis.

var x; function f(a, b) { a(b); return x; }

isn't analyzable because f(eval, 'var x = 1;') could cause the returned x to refer to a local instead of the outer x but add "use strict" to either scope and suddenly it is statically analyzable.

When you say that strict mode "does not carry its weight," are you saying that the ability to do sound static analysis doesn't warrant the additional complexity, or are you referring to a different bundle of benefits?

# Brendan Eich (7 years ago)

One thing that may not be obvious:

On Wed, Jul 26, 2017 at 8:52 AM Brendan Eich <brendan.eich at gmail.com> wrote:

I created JS in 1995. In 1996 I made a few incompatible changes to JS and got away with it, but not in 1997. ES3 was done in 1999 based on de-facto work in Netscape and IE that converged (mostly; a few edge cases) around the same time, but even by 1998 the only way to coordinate was via the ECMA-262 standards work, not just ES1 but the discussions about future work we were having in 1997.

Netscape had effective monopoly control of JS in 1995 and into 1996, but was losing it by 1997 with IE4 coming out. No browser has it now, although Chrome has the most market power.

Even monopolies cannot repeal price law -- they can only force deadweight losses on customers up to a limit where the customer does without, or else substitutes by going outside the monopoly system. With JS, there was risk at the limit of users going without JS. There was risk too, small at first but growing to large, of users substituting IE and even using VBScript instead of JS.

Fortunately ;-), JS was first and good-enough, and standardized enough, to head off the VBScript substitution.

So I couldn't just change JS any way I wanted based on market power. Nor can Chrome now, or in a future where it got closer to IE's top (2004?) share of 95% (per wikipedia).

# Andreas Rossberg (7 years ago)

On 26 July 2017 at 18:10, Mike Samuel <mikesamuel at gmail.com> wrote:

On Wed, Jul 26, 2017 at 5:55 AM, Andreas Rossberg <rossberg at google.com> wrote:

And clearly, modes or versions only make things worse in that regard. Strict mode already is a pig when it comes to implementation complexity (in retrospect, it does not carry its weight IMHO). ES6 made it worse. Our

IIRC, the primary argument for strict mode wasn't implementation simplicity, but the ability to do sound static analysis.

Right, I was merely lumping a reply to two different suggestions into a single reply.

var x; function f(a, b) { a(b); return x; }

isn't analyzable because f(eval, 'var x = 1;') could cause the returned x to refer to a local instead of the outer x but add "use strict" to either scope and suddenly it is statically analyzable.

Actually, it cannot. An indirect call to eval cannot inject anything into the caller scope.

On the other hand, any use of indirect eval can inject something into the global scope, whether the caller is in strict mode or not. Overall, I thus don't think that strict mode makes JavaScript sufficiently better.

When you say that strict mode "does not carry its weight," are you saying that the ability to do sound static analysis doesn't warrant the additional complexity, or are you referring to a different bundle of benefits?

The "ability to do sound static analysis" is not a binary characteristics. You can do analysis on JS. With strict mode you have a couple more invariants, so can do slightly better, but from my perspective it's not even close to a game changer.

# Brendan Eich (7 years ago)

Strict mode also made runtime-incompatible changes, e.g. arguments[i] not aliasing i'th formal parameter, which required two-way testing while strict mode adoption was nascent or partial (which of course many devs skipped).

On Wed, Jul 26, 2017 at 9:53 AM Andreas Rossberg <rossberg at google.com> wrote:

The "ability to do sound static analysis" is not a binary characteristics. You can do analysis on JS. With strict mode you have a couple more invariants, so can do slightly better, but from my perspective it's not even close to a game changer.

Agreed (static analysis approximates runtime, so the ability to do it is of course not binary -- many trade-offs).

From my memory of the meetings and online discussions, strict mode was not meant to make static analysis significantly easier. More important was enabling Caja (now SES) to "use strict" and do less work, static and at runtime. Implementation and legacy loopholes continue to confound such efforts :-).

# Allen Wirfs-Brock (7 years ago)

On Jul 26, 2017, at 10:56 AM, Brendan Eich <brendan.eich at gmail.com> wrote:

From my memory of the meetings and online discussions, strict mode was not meant to make static analysis significantly easier. More important was enabling Caja (now SES) to "use strict" and do less work, static and at runtime. Implementation and legacy loopholes continue to confound such efforts :-).

From my memory, static analysis of eval impact was definitely a motivator for the strict mode eval semantics. Similarly, it was part of the motivation for eliminating with.

# Mike Samuel (7 years ago)

On Wed, Jul 26, 2017 at 1:56 PM, Brendan Eich <brendan.eich at gmail.com> wrote:

Strict mode also made runtime-incompatible changes, e.g. arguments[i] not aliasing i'th formal parameter, which required two-way testing while strict mode adoption was nascent or partial (which of course many devs skipped).

On Wed, Jul 26, 2017 at 9:53 AM Andreas Rossberg <rossberg at google.com> wrote:

The "ability to do sound static analysis" is not a binary characteristics. You can do analysis on JS. With strict mode you have a couple more invariants, so can do slightly better, but from my perspective it's not even close to a game changer.

Agreed (static analysis approximates runtime, so the ability to do it is of course not binary -- many trade-offs).

Of course, no static analysis can be complete for all programs in a Turing complete language.

At the time it was being debated, we couldn't assume closure integrity if a parameter could alias eval, which made static analysis of even simple programs really, really hard. It looks like both violations of closure integrity via indirect aliasing of eval got rolled into non-strict mode, though. I was just confused.

From my memory of the meetings and online discussions, strict mode was not meant to make static analysis significantly easier. More important was enabling Caja (now SES) to "use strict" and do less work, static and at runtime. Implementation and legacy loopholes continue to confound such efforts :-).

Fair enough.

# Florian Bösch (7 years ago)

On Tue, Jul 25, 2017 at 11:50 PM, Brendan Eich <brendan.eich at gmail.com> wrote:

Core language changes are different in kind from sync touch events. It's very hard to plan to remove anything on a practical schedule or order-of-work basis. Engine maintainers likely still hate more modes, and users should too. New syntax as its own opt-in still wins, although this obligates TC39 to work on future-proofing, e.g., : after declarator name in declaration for type annotation syntax.

There's a point at which you cannot add anything new meaningful because of the broken things. And you can't remove the broken things because you're committed to eternal backwards compatibility. And you can't add modes because nobody likes them. That's just planned obsolescence. This means JS is not a living language, or won't be much longer in any case. It's probably best if whatever you run on the web ships its own interpreter that runs on whatever flavor runtime (JS, asm.js or Web Assembly) is available.

# Michał Wadas (7 years ago)

I know that it's hard to remove features from the web. That's why I propose a *clear and well-defined* route to clean up the language.

Browsers already "broke" the web many times. Java Applets are dead. ActiveX is dead (though some government websites still require it). Flash will be dead in few years. And some sites stopped working because of this.

Backward compatibility is a great thing that made Web successful. But no other environment have a goal to be backward compatible forever. Windows 98 .exe will probably not run on Windows 10. Old Android apps does not run on new devices. Very old iOS apps does not run on new devices. Why would anyone expect to run 20 years old code successfully on new browser? And why we limit it only to JavaScript code, if other Web APIs and behaviours were changed in past?

# T.J. Crowder (7 years ago)

On Wed, Jul 26, 2017 at 7:37 PM, Florian Bösch <pyalot at gmail.com> wrote:

This means JS is not a living language, or won't be much longer in any case.

"Much longer" is of course entirely subjective, but let's not be too dramatic; I think we can count on JavaScript being a living language for at least another 10 years, regardless of what happens with WebAssembly and similar.

If WebAssembly (or similar) does stabilize, spread, and mature, that will enable a thriving ecosystem of languages that compile to it, making JavaScript only one of many (just as it is now outside of browsers). I love the language, but I love the idea of it being one of many that can target browsers even more.

And that will probably allow JavaScript itself more freedom at that point in terms of evolution, keeping it alive and healthy beyond its browser-limited existence.

-- T.J. Crowder

# Brendan Eich (7 years ago)
  1. There's no evidence (I'm sitting in a TC39 meeting), other than grumping from a few, that we are near the point of JS being painted into a corner by backward compatibility.

  2. WebAssembly is happening. Dynamic language support will take a while.

Together these suggest JS evolution will continue. It shall continue.

# Florian Bösch (7 years ago)

On Wed, Jul 26, 2017 at 9:00 PM, T.J. Crowder <tj.crowder at farsightsoftware.com> wrote:

keeping it alive and healthy beyond its browser-limited existence.

Many languages (including Python and Perl) concluded that at some point things have to be "cleaned up". The track record of languages that never cleaned up isn't... great. You could consider things like RPG, Cobol, Fortran, etc. "alive" because they're still used. But in any other sense of the word they aren't.

# Bruno Jouhier (7 years ago)

But no other environment has a goal of being backward compatible forever

JavaScript is often referred to as the "assembly language of the web". Did any instruction ever get removed from the x86 instruction set? Will this ever happen?

Also, modern C compilers still compile C code written in the 70s, even if they use obsolete syntax (pre-ansi K&R parameter declarations). Did features get removed from C? (I don't know the answer - anyone ever used trigraphs?). Is this a problem for C programmers?

There is an issue of scale here. You can impose an upgrade behind a corporate firewall; much harder in the open.

Bruno

# Florian Bösch (7 years ago)

On Wed, Jul 26, 2017 at 9:47 PM, Bruno Jouhier <bjouhier at gmail.com> wrote:

JavaScript is often referred to as the "assembly language of the web". Did any instruction ever get removed from the x86 instruction set? Will this ever happen?

There are some features of x86 which were ditched. The better-known example would be MMX (though the idea lives on in SSE/SIMD). But then there's also ARM and its slow crawl to replace x86, mainly in areas x86 didn't have much luck capturing (mobile), but which is now increasingly making its way into nettops and will undoubtedly start to tackle high-end personal computing eventually. In other areas (such as GPUs), vendors frequently toss support for features deemed obsolete (though many of them remain supported in driver software paths; just don't use those features, because the speed sucks).

# Bruno Jouhier (7 years ago)

There are some features of x86 which were ditched. The better-known example would be MMX (though the idea lives on in SSE/SIMD). But then there's also ARM and its slow crawl to replace x86 ...

MMX was not in the original set and looks more like an abandoned experiment (a bit like ES4). But you are right, some obscure instructions got dropped (stackoverflow.com/questions/32868293/x86-32-bit-opcodes-that-differ-in-x86-x64-or-entirely-removed), like ES dropping arguments.caller. ARM would be more like Dart (with a brighter future). Even if the analogy is not perfect, scale is the key factor in these phenomena.

# Brendan Eich (7 years ago)

On Wed, Jul 26, 2017 at 12:14 PM Florian Bösch <pyalot at gmail.com> wrote:

On Wed, Jul 26, 2017 at 9:00 PM, T.J. Crowder <tj.crowder at farsightsoftware.com> wrote:

keeping it alive and healthy beyond its browser-limited existence.

Many languages (including Python and Perl) concluded that at some point things have to be "cleaned up".

You have not addressed my points about the difficulty of removing things on the Web. Citing Perl and Python implies you want opt-in versioning, which I had proposed for ES4 back in the day, before I wised up ;-). A couple of points:

  1. Perl 5 vs. 6 and Python 2 vs. 3 are not great precedents. I think they are counterexamples, even given that Unix command-line tools can be installed and used on single systems or in cloud settings without the huge coordination problem posed by the Web. Perl and Python have forked and split their communities between versions. The Python rift may heal, but these forks have real and high costs. I know; we use Python at Brave in our build system.

  2. Opt-in versioning on the Web is an anti-pattern, as discussed at length on es-discuss. The better way, dubbed 1JS, is to let old forms fall into disuse while carefully extending the language with new syntax and semantics that compose well with the existing surface language, using a kernel semantics approach. This is still TC39's settled conviction as best foot forward.

The track record of languages that never cleaned up isn't... great. You could consider things like RPG, Cobol, Fortran, etc. "alive" because they're still used. But in any other sense of the word they aren't.

Those languages forked and some modernized (I remember Fortran 77). Those are all quite a bit older than JS. I would also suggest they are for the most part stunning successes. We've learned a lot from them.

But the point of order here is whether JS can even be forked as Perl and Python have been. Another point to discuss is what you mean by "isn't... great." Aesthetics aside, keeping compatibility maximizes utility. There is risk of "painting into a corner", making conflicts in the kernel semantics or surface language over time, or just making a kitchen sink language. These are not malum in se but costs to be traded off for benefits.

If the aesthetic or Platonic ideal approach prevails, almost any successful language is not "alive" because it is messy. But that's false: C is still alive, C++ is quite alive, etc. I suggest being precise about costs vs. benefits and avoiding vague or counterfactual metaphorical judgments ("isn't... great", not "alive").

# Florian Bösch (7 years ago)

On Wed, Jul 26, 2017 at 11:41 PM, Brendan Eich <brendan.eich at gmail.com> wrote:

Those languages forked and some modernized (I remember Fortran 77). Those are all quite a bit older than JS. I would also suggest they are for the most part stunning successes. We've learned a lot from them.

Yes, but we'll also want people to want to use a language, not just use it because eons ago something was written in it and now there is no way out. JS has to keep pace or it will end up like those languages: some relic from the past that nobody uses if they can possibly avoid it. I don't think the mission brief of JS can be "The best language you hate using but can't avoid using anyway."

# Brendan Eich (7 years ago)

Languages have warts, not just JS. No cleanup is perfect, and more warts come over time. If your point is merely about a "language you hate" but must perforce use on the Web, I think you should be happy right now. The solution is not to hate JS. It's not going to change incompatibly. Rather, you can use linters, "transpilers", compilers, voluntary unchecked subsets -- all possible today.

If you then object to having to use a tool or a subsetting discipline, I'm not sure what to say. The with statement is not forcing you to use it. Avoid it!

If you are concerned with the "painting into the corner" problem for engine implementors, the big ones are all in the room here and they can cope.

If you are concerned about JS pedagogy or marketing, the solution already practiced is to subset. Just as when teaching English or another evolved, irregularity-ridden living language.

# Brendan Eich (7 years ago)

On Wed, Jul 26, 2017 at 11:59 AM Michał Wadas <michalwadas at gmail.com> wrote:

I know that it's hard to remove features from the web. That's why I propose a *clear and well-defined* route to clean up the language.

Instead of asserting in bold, why not answer the questions I posed in reply to your clear but incomplete proposal?

Suppose TC39 said "with" was going away in 2027. Who among content owners, developers they hire sporadically, or browser users visiting their sites would do anything, and why would they do it? If a browser in 2027 ships without "with" support ahead of other major browsers, what happens to its support costs and market share?

Browsers already "broke" the web many times. Java Applets are dead. ActiveX is dead (though some government websites still require it). Flash will be dead in few years. And some sites stopped working because of this.

You are citing proprietary plugins. The Web of which JS is a part is defined by open standards from Ecma, WHATWG, W3C, IETF. We survived plugins dying (and good riddance, in general; credit to Flash for filling gaps and still doing things the standard Web cannot do well -- this is to the shame of the Web, no argument).

Ok, so proprietary or not, plugins died and that has costs. But they are borne by sites who dropped those plugins, one by one. They are not imposed (at least not till Brave, or now with the plan to kill Flash by 2020 among Adobe and the big four browsers) from the client side. Again, the browser-market game theory does not work. Please respond to this clear and well-defined point :-P.

# Florian Bösch (7 years ago)

On Thu, Jul 27, 2017 at 12:18 AM, Brendan Eich <brendan.eich at gmail.com> wrote:

The solution is not to hate JS. It's not going to change incompatibly. Rather, you can use linters, "transpilers", compilers, voluntary unchecked subsets -- all possible today.

So basically "the best way to use JS is to not use JS". Awesome.

# doodad-js Admin (7 years ago)

I sense a reluctance to start a new version of JS. After all, JS is only at version 1 and has survived at least 20 years. Every piece of software, languages included, has major versions with breaking changes, and I don’t see why JS should be an exception. It would just be great to make a big cleanup, stabilize things, make real classes, a real type system (dynamic or static) ... But maybe that’s reserved for another new Web language after all.

# Tab Atkins Jr. (7 years ago)

On Wed, Jul 26, 2017 at 3:37 PM, Florian Bösch <pyalot at gmail.com> wrote:

On Thu, Jul 27, 2017 at 12:18 AM, Brendan Eich <brendan.eich at gmail.com> wrote:

The solution is not to hate JS. It's not going to change incompatibly. Rather, you can use linters, "transpilers", compilers, voluntary unchecked subsets -- all possible today.

So basically "the best way to use JS is to not use JS". Awesome.

That's the downside of shipping your programs to customers as source, and letting them use any of 100+ compilers of varying ages and quality to compile your code. (There's plenty of upsides, of course.)

As Brendan said, examples of other languages don't really apply, because they compile on the developer end, and just ship binaries to customers. (Or something that has the same effect, like shipping source+interpreter to customers in a package.) If you want to benefit from those network dynamics, you have to compile on your end, or in the language of today, "transpile".

That doesn't mean "not use JS" - Babel and related projects let you use modern JS, and you can apply whatever restrictions you want. Or you can go ahead and abandon JS, and use one of the myriad of alternative transpilation languages. Whatever floats your boat.
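For instance, a rough sketch of what "transpiling" buys you; the exact output varies by tool and target:

```javascript
// What you write (ES2016 exponentiation, arrow function):
const modern = [1, 2, 3].map(n => n ** 2);

// Roughly what a transpiler targeting ES5 emits instead:
var transpiled = [1, 2, 3].map(function (n) { return Math.pow(n, 2); });

console.log(modern.join()); // "1,4,9"
console.log(transpiled.join()); // "1,4,9"
```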

But you can't get around the mathematics. Delivering plain source, without a well-controlled compiler monopoly, means breaking changes are very, very hard to make. Best to make peace with it and engineer around it, rather than futilely fight it.

# Michael Kriegel (7 years ago)

I have been reading this discussion for a long time and I do not see anything in it which helps anyone...

I see TC39 members/supporters saying there is no issue for them in still having the old features in place:

Brendan Eich (TC39): "There's no evidence (I'm sitting in a TC39 meeting) other than grumping from a few that we are near the point of JS painted into a corner by backward compatibility."

Andreas Rossberg (google): "As for the reoccurring assumption that deprecation would help simplifying JavaScript implementations: no, not to a relevant degree (...) And clearly, modes or versions only make things worse in that regard."

Can't we agree on the following:

"As long as TC39 members do not feel being painted into a corner because of backwards compatibility and as long as browser vendors do not indicate having trouble maintaining the old features and as long as those old features are not security risks by design, there is no need to discuss further about the removal of language features."?

As a developer, a "user" of JavaScript, I have no problem with features being around that I do not use. If there are features that a group of people (even if it were the whole JS developer community) agrees are evil, they can agree not to use them. And as a developer using JavaScript I am thankful for the great work the TC39 and browser vendor folks do to keep this all rolling. If they ever say that, for a good reason, they have to abandon a feature which I used and maybe even liked, I would spend all the time necessary on making my software work without it. That said, I see no value in us developers discussing the removal of old features which we just do not like.

# Isiah Meadows (7 years ago)

I agree. The only people who really have a stake in this discussion apart from committee members are implementors, and trust me: they really don't like having to support features deprecated for over a decade.

My only question at this point is: would it be possible to emit deprecation warnings for some features, so it would be easier to remove some of the legacy bloat? (example: RegExp.$1)
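For context, a sketch of what such a warning would fire on; the legacy statics mirror the last match's capture groups onto the RegExp constructor itself:

```javascript
// Deprecated: after any match, capture groups leak onto the constructor
// itself as global, mutable state.
/(\d+)-(\d+)/.test('call 555-1234');
console.log(RegExp.$1, RegExp.$2); // "555" "1234"

// The supported equivalent keeps match state local:
const m = 'call 555-1234'.match(/(\d+)-(\d+)/);
console.log(m[1], m[2]); // "555" "1234"
```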

# Michael Kriegel (7 years ago)

Maybe TC39 should think about a deprecation plan, which includes rules for fairness between browser vendors. For example, if the feature RegExp.$1 shall be removed. Then:

  1. At date X, the feature gets marked as deprecated.

  2. Within 6 Months from X, all browser vendors must have added this feature to their deprecation list and show a deprecation warning in the console.

  3. At a fixed date (e.g. 12 Months after X) all browsers must show a warning to the user (e.g. red address bar, etc.), when the website he visits uses a feature from the deprecation list: "The website you are visiting uses features, which will be removed in the future. Please ask the website owner to update his website." - All browser vendors are obliged to start this warning beginning with that date - so the browser has to check for the date.

  4. At a fixed date (e.g. 24 Months after X) all browsers must stop supporting the feature, which means that they just refuse to show that broken website and instead show a message to the user that the website cannot be shown anymore because its features are no longer supported.

  5. All browser versions released after that must have the feature permanently disabled or removed.

Authors of websites for which there is still interest will update their code. Other websites will just break and nobody will care. Companies using browser-based internal tools may decide to stick with older browser versions for that purpose, which is totally fine; it's their risk (security holes, etc.?)

(Optional idea at step 2: If the website author enabled it, the browser tries to send a deprecation warning to deprecation-warning at whatever.ch, where whatever.ch is the domain of the website the user visited. Or maybe this should be communicated to the web server, which may then send a message itself.)

So I do not see a risk of "breaking the web" when there is such a clear plan set up. There remains the question of how browser vendors could be punished if they do not comply and try to gain an advantage over other browsers by continuing to support those old features. Maybe search engine developers could agree on downranking websites that still use deprecated features after point 3 in time, which would reduce interest in the site and increase the will of the website owner to improve.

This was just thinking out loud... I will stick to every decision TC39 makes about language feature removal.

# Bill Frantz (7 years ago)

On 7/26/17 at 3:18 PM, brendan.eich at gmail.com (Brendan Eich) wrote:

If you are concerned about JS pedagogy or marketing, the solution already practiced is to subset. Just as when teaching English or another evolved, irregularity-ridden living language.

The real problem with bloat is reading code, not writing it. However, if a reader can easily look up a nearly abandoned, but still supported, construct to find out what it does and what footguns it includes, that situation is probably better than having some unmaintained web site fail when a new version of a browser comes out that no longer supports that construct.

YMMV - Bill



# Isiah Meadows (7 years ago)

Sounds good, except for a few nits:

  1. Engines will know ahead of time when the deprecation will start, as it'll likely be in the draft 6 months or more before then.
  2. It's one thing to print a deprecation message. It's another to alert the end user over something that shouldn't matter to them. This isn't a security issue like using compromised certificate algorithms (getting people off MD5 was a nightmare), so it isn't worth alerting the user over.
  3. Requiring browsers to stop implementing something is as easy as making it a forbidden extension (like what was already done with arguments.caller). I'm not sure we would need anything other than a formal document stating that practice is a thing.
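A sketch of what the poison-pill variant of that practice already looks like from inside the language, using arguments.callee, which strict mode forbids:

```javascript
// In strict mode the arguments object carries a poisoned `callee`
// accessor: touching it throws a TypeError instead of returning the
// enclosing function.
function probe() {
  'use strict';
  try {
    return arguments.callee;
  } catch (e) {
    return e instanceof TypeError;
  }
}
console.log(probe()); // true
```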

Also, keep in mind, some things are much more difficult to remove (e.g. the with statement) than others (e.g. arguments.callee). So it may be necessary to 1. make logging optional with "should", and 2. call the relevant hook with something to denote what's being deprecated, so implementors/embedders can avoid logging things that would end up super spammy in their particular use case. For example:

  • Things like String.prototype.bold are pretty safe to just remove in server runtimes due to being mostly useless, but not so much in browsers, where it is occasionally used.
  • It's harder to remove with on servers than in browsers because code-gen templating that relies on it (like Lodash templates) is more popular there, and there is no need to precompile for startup performance.
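To make the with-in-templating point concrete, here is a minimal sketch in the style of Lodash templates (the `<%= %>` syntax and the `compile` helper are illustrative, not Lodash's actual implementation):

```javascript
// Bare names like `name` inside the template resolve against the data
// object through the `with` scope of the generated function.
function compile(tpl) {
  var body = "with (data) { return '" +
    tpl.replace(/<%=\s*(.+?)\s*%>/g, "' + ($1) + '") +
    "'; }";
  // The Function constructor yields a sloppy-mode function, so `with` is legal.
  return new Function('data', body);
}

var greet = compile('Hello <%= name %>!');
console.log(greet({ name: 'es-discuss' })); // "Hello es-discuss!"

// String.prototype.bold, by contrast, is trivial to replace:
console.log('hi'.bold()); // "<b>hi</b>"
```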

# Bruno Jouhier (7 years ago)

  3. At a fixed date (e.g. 12 Months after X) all browsers must show a warning to the user (e.g. red address bar, etc.), when the website he visits uses a feature from the deprecation list: "The website you are visiting uses features, which will be removed in the future. Please ask the website owner to update his website." - All browser vendors are obliged to start this warning beginning with that date - so the browser has to check for the date.

My stepmother calls me: Bruno, there is a strange message on my screen. Can you help? I reassure her.

  4. At a fixed date (e.g. 24 Months after X) all browsers must stop supporting the feature, which means that they just refuse to show that broken website and instead show a message to the user that the website cannot be shown anymore because its features are no longer supported.

My stepmother calls again: my tablet is broken, can you fix it?

The website that my stepmother was visiting was built by the non-profit accountant's nephew eight years ago. He is climbing a mountain.

We see things with our technologist eyes. Many (most?) web users don't understand whether something is wrong with the browser or with the server, or with the network (and they don't care). For them it is just "broken".

We've broken the web. My stepmother is happy with her apps.

# Mark (7 years ago)

I've been following this thread since it started. Maybe it's in here somewhere and I missed it throughout these many comments. But...

It has already been mentioned that there is likely no performance degradation when adding new features. If performance is an issue because a developer is using legacy/outdated methods in their code, the developer should just update their code to use the newer, better APIs.

Alternatively, JS has done a good job of adding new features that are optional and that do not affect outdated ones, so there is nothing lost here. If you don't want to use a new feature, don't use it... ever (if necessary).

So can someone give me at least a couple of hard use cases where introducing a new JS feature requires the removal of another, and where not removing the old feature would cause significant harm to the future of the language? I don't mean an anecdotal or insignificant case like having to choose a different reserved word because one was already taken (i.e. will we run out of words? :) )

AFAICT, the committee is working extremely hard to introduce some pretty new and exciting things to JS every day: tc39/proposals.

And I don't think this progress would be improved much by removing old JS features. So I'm seriously having trouble understanding the assumption that we need to remove JS features just to move the language forward.

# Andreas Rossberg (7 years ago)

On 27 July 2017 at 11:00, Mark <mark at heyimmark.com> wrote:

It has already been mentioned that there is likely no performance degradation when adding new features.

That is not always true. For example, ES6 has caused some notable performance regressions for ES5 code initially, due to extensions to the object model that made it even more dynamic. The new @@-hooks were particularly nasty and some cases required substantial amounts of work from implementers just to get back close to the previous baseline performance. Parsing also slowed down measurably. Moreover, many features tend to add combinatorial complexity that can make the surface of "common cases" to optimise for in preexisting features much larger.
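For illustration, one of the @@-hooks in question: Symbol.species makes even the type of Array.prototype.map's result a dynamic, user-overridable lookup (the `Logged` subclass here is a made-up example):

```javascript
// Before ES6, map() on an Array always produced a plain Array, a fact
// engines could bake into optimized code. Symbol.species turns that
// into a dynamic lookup on every call.
class Logged extends Array {
  static get [Symbol.species]() { return Array; } // redirect derived objects
}

const src = Logged.of(1, 2, 3);
const out = src.map(x => x * 2);

console.log(out instanceof Logged); // false (species redirected it)
console.log(out instanceof Array); // true
```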

# Mark (7 years ago)

Yeah, but performance issues are different from not being backwards-compatible.

I think it is a mistake to assume that a developer has a right to always have optimal performance without requiring anything to get it. It almost sounds like you're saying that there should be no cost to the consumer for choosing not to evolve with the language.

Things we buy into in life (not just programming languages) depreciate in value and will require an upgrade, a replacement, or significant maintenance; if none of that is done, the consumer will suffer the consequences of choosing to remain stagnant. And the longer the stagnation, the greater the change needed to put the consumer in the same (or a better) position than before the depreciation got so bad.

That said, I'm still struggling to see a real need to remove older JS features.

# Andreas Rossberg (7 years ago)

On 27 July 2017 at 12:07, Mark <mark at heyimmark.com> wrote:

I think it is a mistake to assume that a developer has a right to always have optimal performance without requiring anything to get it. It almost sounds like you're saying that there should be no cost to the consumer for choosing not to evolve with the language.

In an ideal world, I would agree. But that's not how the game theory works out on the web. What Brendan already pointed out wrt breaking sites also applies to performance regressions: if Joe User observes that a website suddenly got notably slower with a new version of their browser then they will blame the browser. That effect is further elevated by tech reviews often performing measurements with hopelessly outdated benchmarks. So no browser vendor can afford significant regressions, unless they have an urge to look bad in public perception.

# kai zhu (7 years ago)

On Jul 27, 2017, at 5:43 PM, Andreas Rossberg <rossberg at google.com> wrote:

That is not always true. For example, ES6 has caused some notable performance regressions for ES5 code initially, due to extensions to the object model that made it even more dynamic. The new @@-hooks were particularly nasty and some cases required substantial amounts of work from implementers just to get back close to the previous baseline performance. Parsing also slowed down measurably. Moreover, many features tend to add combinatorial complexity that can make the surface of "common cases" to optimise for in preexisting features much larger.

I’ve noticed Chrome 59 freezing more when initially loading pages. Maybe it’s due to the performance penalty of extra parser complexity, maybe not. Also, the Chrome-based Electron browser has gotten slower with each release over the past year, when I use it to test mostly ES5-based browser code. I can’t say much about the other browser vendors, as I don’t use them as much.

# T.J. Crowder (7 years ago)

On Thu, Jul 27, 2017 at 12:55 PM, kai zhu <kaizhu256 at gmail.com> wrote:

I’ve noticed Chrome 59 freezing more when initially loading pages. Maybe it’s due to the performance penalty of extra parser complexity, maybe not.

Much more likely due to the major changes in V8 v5.9: v8project.blogspot.co.uk/2017/05/launching-ignition-and-turbofan.html

-- T.J. Crowder

# Mark (7 years ago)

Reposting (with edits) because I accidentally sent this to only Andreas

if Joe User observes that a website suddenly got notably slower with a new version of their browser then they will blame the browser.

This is a rather large assumption to make and, at the same time, I don’t think it is true. When users go to a slow-loading website, I think it’s much more likely they’ll blame the website developer. If an application runs slowly on your OS, you wouldn’t blame the OS vendor. Similarly, if an application I just upgraded runs slowly on my mobile device, I wouldn’t automatically assume it’s the phone manufacturer.

# Andreas Rossberg (7 years ago)

On 27 July 2017 at 14:23, Mark <mark at heyimmark.com> wrote:

if Joe User observes that a website suddenly got notably slower with a new version of their browser then they will blame the browser.

This is a rather large assumption to make and, at the same time, I don’t think it is true. When users go to a slow-loading website, I think it’s much more likely they’ll blame the website developer. If an application runs slowly on your OS, you wouldn’t blame the OS vendor. Similarly, if an application I just upgraded runs slowly on my mobile device, I wouldn’t automatically assume it’s the phone manufacturer.

I'm talking about the situation where they upgrade the browser and observe that a known site runs slower afterwards than it did before. Nothing else changed. Of course they're going to blame it on the browser update, and correctly so.

# Boris Zbarsky (7 years ago)

On 7/27/17 2:02 AM, Michael Kriegel wrote:

  4. At a fixed date (e.g. 24 Months after X) all browsers must stop supporting the feature

How do you plan to enforce this?

Please note that the people representing browsers in this committee may not (and afaict generally do not) make ship/no-ship product decisions for their browsers, so they can't even credibly commit to what you suggest.

Authors of Websites, for which there is still interest, will update their code. Other websites will just break and nobody will care.

Unfortunately, you're wrong. That's because interest is asymmetric: users may have interest in a site even if the author does not. So it's quite possible (and in fact has happened before) that sites will not be updated, they will break, and users will in fact care.

This happens all the time, even with well-advertised multi-year deprecations, well publicized cutoff times and large companies that have the resources to update their sites if they want to. See the story of Google Hangouts, for example.

So I do not see a risk of "breaking the web" when there is such a clear plan set up. There would be just the question how browser vendors could be punished, if they do not comply and try to get an advantage over other browsers by continuing support of those old features...?

Good luck with that.

# Mike Samuel (7 years ago)

On Thu, Jul 27, 2017 at 1:33 AM, Isiah Meadows <isiahmeadows at gmail.com> wrote:

I agree. The only people who really have a stake in this discussion apart from committee members are implementors, and trust me: they really don't like having to support features deprecated for over a decade.

I am not an implementor, but I would like to restake my claim as a security practitioner that I made in

esdiscuss.org/topic/removal-of-language-features#content-7

My only question at this point is: would it be possible to emit deprecation warnings for some features, so it would be easier to remove some of the legacy bloat? (example: RegExp.$1)

Linters were mentioned earlier as a mechanism for this. The problem, as Brendan pointed out, is that you have to get the usage rate very low or have browsers coordinate closely on breaking the holdouts.
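As a sketch of that linter route (assuming ESLint; no-with, no-caller, and no-restricted-properties are real rule names, though the message text here is ours), a project can already "remove" these features for itself:

```javascript
// .eslintrc.js: opt this codebase out of the features debated in this
// thread. Violations fail the lint run rather than breaking the browser.
module.exports = {
  rules: {
    'no-with': 'error', // bans the `with` statement outright
    'no-caller': 'error', // bans arguments.caller and arguments.callee
    'no-restricted-properties': ['error', {
      object: 'RegExp',
      property: '$1',
      message: 'Use the result of String.prototype.match instead.',
    }],
  },
};
```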

# Allen Wirfs-Brock (7 years ago)

On Jul 26, 2017, at 11:02 PM, Michael Kriegel <michael.kriegel at actifsource.com> wrote:

Maybe TC39 should think about a deprecation plan, which includes rules for fairness between browser vendors. For example, if the feature RegExp.$1 shall be removed. Then:

  1. At date X, the feature gets marked as deprecated.

  2. Within 6 Months from X, all browser vendors must…

TC39 has absolutely no authority to tell browser vendors (or anybody else) that they must do something.

All TC39 can do is publish specifications that say what a conforming implementation must do. It is completely up to implementations to choose to conform or not.