semicolon insertion for UseSubsetDirectives
# Waldemar Horwat (17 years ago)
As you suggested, the simplest solution is to make the semicolon after the use directive string literal mandatory.
Waldemar
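A minimal sketch of what the mandatory-semicolon rule means in practice (an illustration, not spec text; `Foo` is borrowed from Mike Samuel's example below):

```js
"use strict";  // explicit semicolon: recognized as a use directive

"use strict"   // no semicolon: under the proposed rule this is not a
+ new Foo()    // directive, just an ordinary ES3 expression statement
```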
# Mark S. Miller (17 years ago)
[+es3.x-discuss]
(Please keep es3.x-discuss on the addressee list of messages relating to the ES3.1 spec.)
# Mike Samuel (17 years ago)

How does the following program parse in the presence of ES3.1 UseSubsetDirectives?
"use strict"
Does semicolon insertion work after UseSubsetDirectives? Even if the next token is an operator that can validly follow a string literal in that context?
Does it matter that the Foo instance's valueOf would be invoked with a type hint of undefined under ES3, but with a type hint of 'number' under ES3.1?
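The hint difference can be made concrete. A minimal sketch, assuming a `Foo` whose `valueOf` and `toString` disagree:

```js
function Foo() {}
Foo.prototype.valueOf  = function () { return 42; };
Foo.prototype.toString = function () { return "!"; };

// ES3, no semicolon inserted: a single addition expression,
//   "use strict" + new Foo()
// ToPrimitive is applied to the Foo instance with no hint (which for
// non-Date objects tries valueOf first), and the result is the
// string "use strict42".

// ES3.1, semicolon inserted after the directive: two statements,
//   "use strict";
//   +new Foo();
// Unary plus applies ToNumber, i.e. ToPrimitive with hint Number,
// and the second statement evaluates to the number 42.
```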
In another weird case:
"use strict" /foo,'/* blah() /**/ //'
In ES3, this is the same as `("use strict" / foo), '/* blah() /**/ //'`, but if a semicolon is inserted without regard for the following token being an operator, the `/` starts a regexp, so it becomes the same as `"use strict"; (/foo,'/) * blah();`
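Written out as standalone programs, the two readings look like this (a sketch; the `foo` and `blah` bindings are mine, added so both snippets run):

```js
var foo = 1;
function blah() { return 0; }  // reading 2 actually invokes this

// Reading 1 (ES3, no semicolon inserted): division, the comma
// operator, and a string literal; blah is never called.
("use strict" / foo), '/* blah() /**/ //';

// Reading 2 (semicolon inserted blindly): /foo,'/ is a regexp
// literal, * is multiplication, and blah() runs as a side effect.
"use strict";
(/foo,'/) * blah();
```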
I think the difference in behavior in the first case is ignorable, but the significant change in the AST produced in the second provides a lot of opportunity for security breaches. Disallowing semicolon insertion after a UseSubsetDirective, so that the tokenization is the same as in ES3, would solve that, and I think lint tools and interpreter warnings can advise when a string token that looks like a use subset directive is being ignored because a semicolon is lacking.
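For illustration, a lint-style check along those lines might look like this (a rough sketch; the function name and the operator heuristic are hypothetical, not anything from the thread):

```js
// Warn when a leading directive-looking string literal is followed by
// a token that continues the expression, so ES3 parsing will fold the
// string into a larger expression and silently ignore the "directive".
function warnIgnoredDirective(source) {
  var m = /^\s*(['"])use [^'"\n]*\1/.exec(source);
  if (!m) return null;
  var rest = source.slice(m[0].length);
  // An infix operator (or a continuation such as , . [ () right after
  // the string means it is not a standalone directive statement.
  if (/^\s*[+\-*\/,.([%<>=&|?]/.test(rest)) {
    return 'warning: directive-like string is part of a larger ' +
           'expression and will be ignored as a directive';
  }
  return null;
}

warnIgnoredDirective('"use strict"\n+ new Foo()');  // returns the warning
warnIgnoredDirective('"use strict";\nfoo();');      // returns null
```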
mike