Comparison operator in value proxies
I don't understand. What is "overwriting" an operator?
By overwriting I meant creating a trap for the === operator. Sorry for the confusion.
Adam
2011/4/17 David Herman <dherman at mozilla.com>:
Ah, sorry, I'd overlooked the subject line so I didn't realize you were talking about value proxies. I haven't digested the design space for value proxies, so I don't have anything to say about this yet. (Value proxies will not be in ES.next; they're still in a fairly early stage of design.)
On Apr 17, 2011, at 7:52 AM, Adam Stankiewicz wrote:
> Hello everyone,
> My idea is to disallow overwriting of the === operator, and make the 'compare' operator implement == instead. Why?
> - === means to me that two variables reference the same object, so there is no point in overwriting it. Many programmers would be confused.
The reason === comes up with value types/proxies is the goal of supporting decimal and other number types, where 1.1m === something_computing_fresh_1_1m() must be true.
Decimal in IEEE754r also wants NaNm !== NaNm, and -0m === 0m.
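These are the same rules binary doubles already follow today. A quick check in current JS, using Object.is (which exposes the SameValue comparison and can tell the cases apart that === cannot):

```javascript
// Today's binary doubles already follow the IEEE754 equality rules
// that decimal (IEEE754r) would also want under ===:
console.log(NaN === NaN);         // false: NaN is never === to anything, itself included
console.log(-0 === 0);            // true: === cannot tell the two zeros apart
console.log(Object.is(NaN, NaN)); // true: SameValue treats NaN as equal to itself
console.log(Object.is(-0, 0));    // false: SameValue distinguishes the zeros
```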
> - == has always caused problems; the above change gives it an opportunity to return to grace.

I'm always in favor of adding ways for that which fell from grace to be redeemed over time, by developer choice, without breaking the old form if possible. This is a recurrent theme.

> You don't know if they are the same object? Use ===. Don't know if these objects are the same, but only in the eyes of the programmer? Use ==. Think of == or < as semantic operators, and of === as a hard-coded is-the-same-object function.
The problem is objects in JS are reference types currently. Value types or proxies would be objects that, by virtue of being shallowly frozen, say, could be compared by reference or property values. This needs care in spec'ing since the shallow vs. deep freezing may not cover value types with deeper structures that nevertheless want to be compared by value. Deep comparison is possible of course, just more work.
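To illustrate the deep-comparison concern, here is a rough sketch (the helper name and structure are hypothetical, not taken from any strawman) of comparing shallowly frozen objects by property values, recursing into nested structure:

```javascript
// Hypothetical sketch: compare two objects by value, recursing into nested
// structure. Shallow freezing alone does not guarantee the inner objects are
// frozen value types too, which is the spec'ing concern raised above.
function valueEqual(a, b) {
  if (a === b) return true; // same reference, or same primitive value
  if (typeof a !== 'object' || typeof b !== 'object' ||
      a === null || b === null) return false;
  const ka = Object.keys(a), kb = Object.keys(b);
  if (ka.length !== kb.length) return false;
  return ka.every(k => valueEqual(a[k], b[k])); // deep comparison: more work
}

const p = Object.freeze({ re: 1, im: Object.freeze({ hi: 0, lo: 2 }) });
const q = Object.freeze({ re: 1, im: Object.freeze({ hi: 0, lo: 2 }) });
console.log(p === q);          // false: two distinct references
console.log(valueEqual(p, q)); // true: equal by property values
```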
> The reason === comes up with value types/proxies is the goal of supporting decimal and other number types, where 1.1m === something_computing_fresh_1_1m() must be true.
If 1.1m was defined as a singleton proxy, then === would work correctly, because both 1.1m and give_new_1_1() return the same proxy. Creating a new proxy each time would cause memory overload, especially considering decimals. Additionally, in JS something like 2 === new Number(2) returns false, so in my opinion it is not a good idea to let the programmer define ===, even for decimal proxies. And if we let the programmer create fresh 1_1m objects there would be no way to distinguish them, just because of the trapped ===.
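The singleton idea can be sketched in plain current JS with a hypothetical memoizing Decimal factory (no real proxy machinery; all names here are illustrative only). The factory interns by the digit string, so two calls yield the same frozen object and plain reference === gives the desired answer:

```javascript
// Hypothetical sketch of interned decimals: memoize by the digit string, so
// Decimal('1.1') always returns the same frozen object and plain reference
// === works. This is exactly the memoization objected to later in the
// thread on performance grounds.
const decimalCache = new Map();
function Decimal(digits) {
  let d = decimalCache.get(digits);
  if (!d) {
    d = Object.freeze({ digits });
    decimalCache.set(digits, d);
  }
  return d;
}

console.log(Decimal('1.1') === Decimal('1.1')); // true: same interned object
console.log(Decimal('1.1') === Decimal('2.2')); // false: different values
```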
> The problem is objects in JS are reference types currently. Value types or proxies would be objects that, by virtue of being shallowly frozen, say, could be compared by reference or property values. This needs care in spec'ing since the shallow vs. deep freezing may not cover value types with deeper structures that nevertheless want to be compared by value. Deep comparison is possible of course, just more work.
Exactly. === should be used only as a reference comparator (primitive types are, in a way, singletons). Creating a trap for == gives the programmer a choice of what kind of comparison (even deep) to use for his object.
adam.
On Apr 19, 2011, at 2:20 PM, Adam Stankiewicz wrote:
>> The reason === comes up with value types/proxies is the goal of supporting decimal and other number types, where 1.1m === something_computing_fresh_1_1m() must be true.
> If 1.1m was defined as a singleton proxy, then === would work correctly, because both 1.1m and give_new_1_1() return the same proxy.
Value types are supposed to be implementable in hardware if available. You are assuming they are objects with reference semantics. That contradicts their purpose and why we are discussing them as a new kind of type.
> Creating a new proxy each time would cause memory overload, especially considering decimals.
Not so. IEEE754r fits in 128 bits and that is efficiently passed and returned by value, compared to heap-boxing with copy on write or eager heap allocation and copying -- or as you seem to propose, memoization (which is a deal-killer in performance terms). Yes, literals should be interned or memoized. Not so every computed intermediate or final result!
> Additionally, in JS something like 2 === new Number(2) returns false,
Now you are comparing apples to oranges. "new Number(2)" is explicitly creating an object with reference type semantics. No one is proposing a "new Decimal('1.1')" that would do differently, but what we do propose is a literal 1.1m that does not heap-allocate a mutable object.
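The apples-to-oranges distinction can be seen directly in current JS:

```javascript
// A primitive number versus an explicitly heap-allocated wrapper object:
console.log(2 === Number(2));     // true: Number(2) without `new` returns the primitive 2
console.log(2 === new Number(2)); // false: `new` creates an object with reference semantics

// Two wrapper objects are distinct references even for the same value:
const a = new Number(2), b = new Number(2);
console.log(a === b);             // false
```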
> so in my opinion it is not a good idea to let the programmer define ===, even for decimal proxies. And if we let the programmer create fresh 1_1m objects there would be no way to distinguish them, just because of the trapped ===.
So? Why do you need to distinguish 1.1 from compute_1_dot_1() today with IEEE754 binary double precision floating point? You don't.
Same goes for decimal.
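The point is easy to check with today's doubles; compute_1_dot_1 here is a stand-in for any expression producing the value 1.1 freshly:

```javascript
// With binary doubles there is no identity to distinguish: any expression
// that produces the value 1.1 is === to the literal 1.1, because === compares
// the values, not where they came from.
function compute_1_dot_1() {
  return 11 / 10; // freshly computed on every call
}

console.log(1.1 === compute_1_dot_1()); // true: same value; identity is meaningless
console.log(compute_1_dot_1() === compute_1_dot_1()); // true: every call, same value
```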
>> The problem is objects in JS are reference types currently. Value types or proxies would be objects that, by virtue of being shallowly frozen, say, could be compared by reference or property values. This needs care in spec'ing since the shallow vs. deep freezing may not cover value types with deeper structures that nevertheless want to be compared by value. Deep comparison is possible of course, just more work.
> Exactly. === should be used only as a reference comparator (primitive types are, in a way, singletons).
You are assuming your conclusion. Value types are not reference types, please re-read the strawmen.
> Creating a trap for == gives the programmer a choice of what kind of comparison (even deep) to use for his object.
That's fine but orthogonal to the === issue.
> Value types are supposed to be implementable in hardware if available. You are assuming they are objects with reference semantics. That contradicts their purpose and why we are discussing them as a new kind of type.
I was talking about decimal proxies, not the decimal primitives themselves. Nothing prevents implementing the === operator for decimal primitives, because that happens on the interpreter side. For backward compatibility both Decimal() and new Decimal() would be proxies, and again there is no point in implementing === on the programmer side, because Decimal() would be a singleton proxy, and new Decimal() would create a "copy" of the singleton proxy, so we get the desired behavior: Decimal(1.1) === Decimal(1.1), but Decimal(1.1) !== new Decimal().
>> Creating a new proxy each time would cause memory overload, especially considering decimals.
> Not so. IEEE754r fits in 128 bits and that is efficiently passed and returned by value, compared to heap-boxing with copy on write or eager heap allocation and copying -- or as you seem to propose, memoization (which is a deal-killer in performance terms). Yes, literals should be interned or memoized. Not so every computed intermediate or final result!
Again, I was talking about proxies, not primitives like decimal itself. Primitives are "singletons" by default. I meant that the proxy for a decimal object should be implemented as a singleton (yes, in future JS we could use literals like 1.1m, but anyway that is not backward-compatible).
>> Additionally, in JS something like 2 === new Number(2) returns false,
> Now you are comparing apples to oranges. "new Number(2)" is explicitly creating an object with reference type semantics. No one is proposing a "new Decimal('1.1')" that would do differently, but what we do propose is a literal 1.1m that does not heap-allocate a mutable object.
1.1m would be a synonym for Decimal('1.1'). 1.1m would return an actual primitive (which is not backward-compatible), and Decimal('1.1') would return either a frozen singleton proxy or, in the future, an actual literal indistinguishable from what 1.1m produces.
>> so in my opinion it is not a good idea to let the programmer define ===, even for decimal proxies. And if we let the programmer create fresh 1_1m objects there would be no way to distinguish them, just because of the trapped ===.
> So? Why do you need to distinguish 1.1 from compute_1_dot_1() today with IEEE754 binary double precision floating point? You don't.
I meant distinguishing objects like Decimal('1.1') and new Decimal('1.1').
>>> The problem is objects in JS are reference types currently. Value types or proxies would be objects that, by virtue of being shallowly frozen, say, could be compared by reference or property values. This needs care in spec'ing since the shallow vs. deep freezing may not cover value types with deeper structures that nevertheless want to be compared by value. Deep comparison is possible of course, just more work.
>> Exactly. === should be used only as a reference comparator (primitive types are, in a way, singletons).
> You are assuming your conclusion. Value types are not reference types; please re-read the strawmen.
I'm simplifying things, sorry. But anyway, currently value types and reference objects behave the same as far as the === operator is concerned. === does not look up the object in any complicated way; it just checks whether the references or primitive values are the same. And that should stay.
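Current behavior, for reference: === compares primitives by value and objects by reference, with no structural lookup:

```javascript
// === today: value comparison for primitives, reference comparison for objects.
console.log('ab' === 'a' + 'b');    // true: strings are primitives, compared by value
console.log({ a: 1 } === { a: 1 }); // false: two distinct object references

const o = { a: 1 };
console.log(o === o);               // true: same reference
```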
I don't claim to have understood the strawman entirely. I'm just saying that allowing programmers to create a trap for === would not be a brilliant idea. Actually, I'm going to read the strawman once more, right now :-)
adam.
On Apr 20, 2011, at 8:24 AM, Adam Stankiewicz wrote:
>> Value types are supposed to be implementable in hardware if available. You are assuming they are objects with reference semantics. That contradicts their purpose and why we are discussing them as a new kind of type.
> I was talking about decimal proxies, not the decimal primitives themselves.
The idea is to implement decimal (and rational, and complex, etc. etc.) via value proxies, not via built-in additions to the standard. I.e., "library code".
Thus there would be no difference between "decimal proxies" and "decimal primitives".
Hello everyone,
My idea is to disallow overwriting of the === operator, and make the 'compare' operator implement == instead. Why?
You don't know if they are the same object? Use ===. Don't know if these objects are the same, but only in the eyes of the programmer? Use ==. Think of == or < as semantic operators, and of === as a hard-coded is-the-same-object function.
Overwriting the === operator would also probably cause problems for security (membrane designs etc.), but that is only my guess.