JavaScript Code Churn Rate?
In case you happen to be unaware, the V8 team recently came out with ideas about a “stricter mode” that cuts down on some of the language's more problematic features in exchange for better performance.
Thanks Nelo! Yes, I've seen strong mode and I think it's an interesting idea (though it trades away a bit more usability for performance than I personally would have). Still, though, I'm curious to see the JS churn data if anyone has it, as that affects all approaches in the strong mode vein.
On 11/10/15 7:41 AM, Ethan Resnick wrote:
And how long until they could remove support for the rest of the language altogether?
This makes the fundamental assumption that it's OK to break old things just because they're old. To the extent that the web is used for applications, this is probably OK, but for documents this is really a bad approach because we (well at least some of us) want those to continue to be readable as the web evolves. Otherwise we end up with a "dark ages" later on where things that appeared in print continue to be readable while later digital stuff, even if still available, is not.
And in this case "documents" includes things like interactive New York Times stuff and whatnot...
With regard to breaking old code, even breaking 0.001% of sites is far too many for JS. Believe it or not, for similar reasons, __proto__ was un-deprecated and standardized. And contains was changed to includes, because a once-popular library broke, and a significant number of its users don't keep their version up to date.
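For readers who don't remember the details, here's a small illustration of both cases: a sketch of the general compatibility pattern, not the actual library code involved (the real contains breakage was more indirect):

    // Object.prototype.__proto__ was standardized in ES2015 (Annex B)
    // because too much deployed code already relied on it:
    const base = { greet() { return "hi"; } };
    const obj = { __proto__: base };           // literal form sets the prototype
    console.log(obj.greet());                  // "hi"
    console.log(Object.getPrototypeOf(obj) === base); // true

    // The hazard behind renaming contains to includes: a library extending
    // Array.prototype with its own `contains` (illustrative only).
    if (!Array.prototype.contains) {
      Array.prototype.contains = function (value, from) {
        return this.indexOf(value, from) !== -1;
      };
    }
    // Had a *native* method with the same name but different behavior
    // shipped, code written against the library's version could silently
    // change meaning. Choosing an unused name avoided that collision:
    console.log([1, 2, 3].includes(2));        // true (ES2016)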
To the extent that the web is used for applications, this is probably OK,
but for documents this is really a bad approach because we (well at least some of us) want those to continue to be readable as the web evolves.
Sure, I can appreciate that. And the academic/researcher in me definitely likes the idea of never removing a language feature.
I guess I was just asking in case anyone felt there could be some (very, very low) level of breakage that's tolerable. After all, links/images already go bad pretty regularly and removing bits of JS wouldn't make the web the only medium for which old equipment (here, an old browser) is required to view old content. On that front, print is the remarkable exception; most everything else (audio recordings, video recordings, conventional software) is pretty tightly bound to its original technology. Of course, "other mediums suck at longevity too" isn't much of an argument, but if there's a tradeoff here, maybe it's worth keeping in mind.
Regardless, it seems like there are many less radical approaches that deprioritize old features without making them strictly unavailable, so I'm still curious to know about JS churn rates, if that data exists, to get a sense of the timescale for those approaches.
I'd like to chime in here as this is a pet peeve of mine.
In general, I'd say that the ECMAScript working group and engine vendors have done a better job at handling this than the other Web-related technology groups (in particular, the DOM working group). Specifically, I'm thinking of how the ECMAScript group handled 'non-strict' vs. 'strict' mode, compared with the DOM group's heavy-handed attempt to change Attributes to no longer be Nodes and the fallout from that for me and my company's product.
The core problem, in my opinion, stems from this misguided attempt to 'telemeter' the Web and then use that as justification for removing features. This is all fine and dandy if you only ever live in the 'Internet world', where folks do monthly, if not daily, releases, but many of us don't. We are building Web apps (lots and lots of Web apps) for enterprises behind firewalls that your telemetry will never reach. This code was built long ago, its developers donned their white hats and rode into the sunset a while back, and its expected usable lifetime is measured in years, not months.
When asked by these developers when I expect that they can remove these features, my answer is: "think years... maybe decades." We're working in environments where we're replacing mainframe systems that were written when Jimmy Carter was president (apologies to non-US citizens here; that would've been in the late 1970s). The customers we're doing the work for expect that the "new" systems are going to last as long, unrealistic though that may be.
Here's another way to think about it: consider the speed of Java API evolution, where deprecated methods tend to linger for decades rather than being removed.
My 2 cents.
For clarification in my earlier message, when I say "asked by these developers", I'm speaking of members of browser development teams.
I've been trying to think through possible ways to address JS's growing complexity (something I know I'm not alone in worrying about) that are consistent with "don't break the web". I understand going in that the solution will largely lie in controlling future growth rather than removing existing features, since removal will always be hard and is currently near impossible. Still, I feel like deprecation/subsetting approaches might not have been adequately explored.
Before I go on proposing things without knowing what I'm talking about, though, I was hoping y'all could point me to (or help me by collecting?) some relevant data. In particular, I'm wondering: what's the distribution of the age of js files on the web, accounting for the frequency with which each page is visited? Or, more concretely: suppose you could magically get all new/newly-modified JS to only use a particular subset of the language; how long would it take for that subset to dominate the web, such that engines could heavily optimize for it? And how long until they could remove support for the rest of the language altogether?
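For concreteness, here's a rough sketch of the kind of crude survey this would take, assuming one already had a representative sample of script URLs (weighting by visit frequency is the harder, separate problem). The function name and URL below are just placeholders, and Last-Modified is only a rough proxy for a file's age:

    // Rough sketch: estimate a script's age from its Last-Modified header.
    // Assumes a runtime with a global fetch (Node 18+ or a browser).
    async function scriptAgeDays(url) {
      const res = await fetch(url, { method: "HEAD" });
      const lastModified = res.headers.get("last-modified");
      if (lastModified === null) return null;  // many servers/CDNs omit it
      const ageMs = Date.now() - new Date(lastModified).getTime();
      return ageMs / (1000 * 60 * 60 * 24);
    }

    // e.g. scriptAgeDays("https://example.com/app.js").then(console.log);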
Cheers, Ethan
P.S. Long time es-discuss lurker and I really admire all the great work you folks do here.