Native JS Encryption
I've put some thought into this topic, and I have a few comments for you (see below).
On Fri, Mar 18, 2011 at 10:09 AM, Robert Accettura <robert at accettura.com>wrote:
I'll prefix this by saying I'm not entirely certain if this should be ECMA vs. HTML5 or dual track similar to the "Cryptographically strong random numbers"[1] idea floating around. I pitched the idea initially via a blog post[2] recently which got a lot more positive feedback than I expected. I'll just summarize the more important bits here:
I'd like to propose native cryptography support utilizing a simplified API for basic encryption/decryption. Something along the lines of:

    Crypto.AES.encrypt("foo bar", password);
    Crypto.AES.decrypt(cryptString, password);
AES obviously being one example algorithm.
AES is a block cipher, and as such, to be used properly for encryption you also need to select a "mode of operation" and other parameters that are often poorly understood. Furthermore, as you point out below, any given block cipher or hash function may eventually be supplanted by a new standard (as new attacks are discovered, computing power increases, etc.). In general, I am opposed to the idea of exposing such primitives as the primary API for any crypto library, and I think most cryptographers would agree. Instead, the primitives should only be exposed for users that absolutely have to do something custom or interoperable. The primary API should simply "do the right thing", and give you access to a symmetric-key secure encoding/decoding procedure that implements a secure authenticated encryption mechanism. Ditto for public-key crypto, although there things get a bit more complex. I think you would accomplish a lot more for security in practice if you follow this approach.
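To make that concrete, here is a rough sketch of the shape I have in mind; the names are made up for illustration, not a concrete proposal:

    // Hypothetical API shape, not an existing library: callers never pick
    // ciphers, modes, or IVs. The implementation always applies an
    // authenticated encryption scheme and records which one it used inside
    // the sealed blob, so old data stays decryptable after defaults change.
    var key = Crypto.generateKey();            // opaque symmetric key object
    var sealed = Crypto.seal(key, "foo bar");  // encrypt-and-authenticate
    var opened = Crypto.open(key, sealed);     // verify-then-decrypt; fails loudly on tampering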
I'd expect AES to eventually wane in popularity in favor of something new. SHA-256 and other hashing functions could be good as well. SSL is great for encrypting data in transit, but data is typically in transit for seconds at most, while it sits stored on the client (cookies, DOM storage) or server for extended periods of time in plain text. This would also allow for serving some content encrypted against eavesdropping over HTTP (assuming a shared key is known by both client and server). Encrypting data quickly on the client solves many problems.
I'm not convinced that this use case, or any other use case I could think of for the web (with the possible exception of DRM/encryption on streaming media), would really benefit from speed. On the whole, you would most likely be encrypting fairly small quantities of data, and any larger quantities of data would typically be handled in a context where the encryption/decryption can even be done asynchronously. Even on a mobile phone, a few blocks of AES could probably done in pure JS without noticeable delay. Have you done any experiments with SJCL ( crypto.stanford.edu/sjcl ) for example?
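For anyone who hasn't tried it, SJCL's convenience layer already looks roughly like the high-level API described above; as I understand its defaults, a typical password-based use is:

    // sjcl.encrypt derives a key from the password (PBKDF2), encrypts with
    // AES in an authenticated mode (CCM by default), and returns a JSON
    // string bundling the salt, IV, and ciphertext together.
    var ct = sjcl.encrypt("correct horse battery", "attack at dawn");
    var pt = sjcl.decrypt("correct horse battery", ct);  // "attack at dawn"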
Shabsi
On Mar 18, 2011, at 12:56 PM, Shabsi Walfish wrote:
I'm not convinced that this use case, or any other use case I could think of for the web (with the possible exception of DRM/encryption on streaming media), would really benefit from speed. On the whole, you would most likely be encrypting fairly small quantities of data, and any larger quantities of data would typically be handled in a context where the encryption/decryption can even be done asynchronously. Even on a mobile phone, a few blocks of AES could probably done in pure JS without noticeable delay. Have you done any experiments with SJCL ( crypto.stanford.edu/sjcl ) for example?
In support of Robert's point, we have Firefox Sync [1], which client-side encrypts many blocks of user data (not just passwords; cookies, history, etc.) to hide it from our own (or an alternative; the server is open source) sync service.
This needs native speed, which we provide via privileged-JS-only (our so-called "chrome" user-interface JS) access to our native crypto module (NSS). The volume in blocks and bytes requires it. Using pure-JS crypto lowers performance by an order of magnitude or two.
To your point about the API being "best, most current" crypto-standard (for a given key size, perhaps): that is usable, but in our modern era JS clients must often chat with JS server peers using precisely this or that crypto protocol. So I imagine we'll need both kinds of APIs: best-latest and exactly-this.
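To illustrate the distinction (invented names; neither API exists today):

    // "best-latest": the engine picks today's recommended scheme.
    var blob = Crypto.seal(key, data);
    // "exactly-this": interoperate with a peer that speaks one fixed protocol.
    var ct = Crypto.cipher("AES-128-CBC").encrypt(key, iv, data);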
Mark Miller alluded to a crypto API task group without Ecma TC39. I'm open to it, provided we can include some of the domain experts who have participated on this list (but who may not be employed by Ecma members).
On Mar 18, 2011, at 4:53 PM, Brendan Eich wrote:
In support of Robert's point, we have Firefox Sync [1], [...]
[1] wiki.mozilla.org/Firefox_Sync
[2] www.mozilla.com/en-US/mobile/sync
at least. Sorry for leaving these off last time.
On Fri, Mar 18, 2011 at 4:53 PM, Brendan Eich <brendan at mozilla.com> wrote:
In support of Robert's point, we have Firefox Sync [1], which client-side encrypts many blocks of user data (not just passwords; cookies, history, etc.) to hide it from our own (or an alternative; the server is open source) sync service.
This needs native speed, which we provide via privileged-JS-only (our so-called "chrome" user-interface JS) access to our native crypto module (NSS). The volume in blocks and bytes requires it. Using pure-JS crypto lowers performance by an order of magnitude or two.
I'm not convinced that you need native speed even for this (and it's a bit specialized, since it lives in the browser chrome). It sounds like you are talking about a few MB of data, at most. Native code could probably do that in something like 10 or 20 ms, and even if you are slower by an order of magnitude you can do it in 200 ms in JavaScript (and the performance gap appears to be steadily decreasing at that). Either way, the sync time is probably dominated by network performance. I expect that approximately the same would hold true for mobile.
To your point about the API being "best, most current" crypto-standard: [...] So I imagine we'll need both kinds of APIs: best-latest and exactly-this.
Hence you can allow for versioning and backwards compatibility, like most protocol APIs do.
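A rough sketch of what I mean, with invented scheme names:

    // Illustrative only: each sealed blob records the scheme version that
    // produced it, so upgrading the default never breaks old data.
    var schemes = { 1: aesCcmV1, 2: aesGcmV2 };  // hypothetical implementations
    function open(key, blob) {
      var impl = schemes[blob.v];
      if (!impl) throw new Error("unsupported scheme version " + blob.v);
      return impl.open(key, blob);               // verify-then-decrypt
    }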
Shabsi
On Mar 18, 2011, at 6:44 PM, Shabsi Walfish wrote:
On Fri, Mar 18, 2011 at 4:53 PM, Brendan Eich <brendan at mozilla.com> wrote: In support of Robert's point, we have Firefox Sync [1] [...] The volume in blocks and bytes requires it.
I'm not convinced that you need native speed even for this (and it's a bit specialized, since it lives in the browser chrome). It sounds like you are talking about a few MB of data, at most.
Sorry, no. I'm telling you our product requirements, not soliciting unquantified speculation. Users have tons of data (think all-tabs session histories). Users do not like waiting. We have to hide sync in the existing schedule, so megabytes do add up.
Native code could probably do that in something like 10 or 20 ms, and even if you are slower by an order of magnitude you can do it in 200 ms
I wrote "or two".
To your point about the API being "best, most current" crypto-standard: [...] So I imagine we'll need both kinds of APIs: best-latest and exactly-this.
Hence you can allow for versioning and backwards compatibility, like most protocol APIs do.
Again, sorry: no. We are not sync'ing ES.next or ES.whatever to every protocol and crypto-protocol out there in any future epoch. We can't hope to guess. Instead we would decouple, as the current hardcoded-in-C++ modules distributed in browsers do: provide certified and well-thought-of algorithms.
Ideally it's all doable in JS at good enough perf. Reality is not there yet and won't be for years. Ask around inside Google :-P.
On Mar 18, 2011, at 9:44 PM, Shabsi Walfish wrote:
I'm not convinced that you need native speed even for this (and it's a bit specialized, since it lives in the browser chrome). It sounds like you are talking about a few MB of data, at most. Native code could probably do that in something like 10 or 20 ms, and even if you are slower by an order of magnitude you can do it in 200 ms in JavaScript (and the performance gap appears to be steadily decreasing at that). Either way, the sync time is probably dominated by network performance. I expect that approximately the same would hold true for mobile.
localStorage I believe today is 50 MB, but I think it's safe to say that will climb just like disk cache has been climbing. Even with SSDs, disk space is relatively cheap. I don't think 99.99997% of users would even notice if that was raised to 100 MB. I think we could pretty easily fill 50 MB playing with something like <canvas/> and offering the ability to save, like a Photoshop or Paint clone. Want to do something awesome like Gmail offline with your whole mailbox? More will be needed. That would be the perfect example of where this would be handy on the client side alone.
I think local data is right now in its infancy. There are many reasons for an app to store data on the client (performance, offline support, privacy). Mobile is just pushing this further into the forefront. Users are concerned about privacy. Many vendors would love to be able to say they employ encryption without having to sacrifice performance.
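To make the storage case concrete, using the hypothetical Crypto.AES API from my original post:

    // Encrypt application state before it ever touches DOM storage, so the
    // disk only sees ciphertext. Crypto.AES.* is the proposed, not an
    // existing, API.
    var state = JSON.stringify({ mailbox: mailbox, drafts: drafts });
    localStorage.setItem("appState", Crypto.AES.encrypt(state, password));
    // ...and on the next load:
    var restored = JSON.parse(
        Crypto.AES.decrypt(localStorage.getItem("appState"), password));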
On Fri, Mar 18, 2011 at 4:53 PM, Brendan Eich <brendan at mozilla.com> wrote:
Mark Miller alluded to a crypto API task group without Ecma TC39. I'm open to it, provided we can include some of the domain experts who have participated on this list (but who may not be employed by Ecma members).
"without"? I'm not sure what you mean by this, so I don't know if it's what I intended to allude to ;).
On Mar 18, 2011, at 9:31 PM, Mark S. Miller wrote:
On Fri, Mar 18, 2011 at 4:53 PM, Brendan Eich <brendan at mozilla.com> wrote: Mark Miller alluded to a crypto API task group without Ecma TC39. [...]
"without"? I'm not sure what you mean by this, so I don't know if it's what I intended to allude to ;).
Whoops -- typo'ed "within". Big difference!
Hope the remaining context pointed to the typo. I wouldn't worry about non-members attending otherwise.
On Fri, Mar 18, 2011 at 10:16 PM, Brendan Eich <brendan at mozilla.com> wrote:
Whoops -- typo'ed "within". Big difference!
Hope the remaining context pointed to the typo. I wouldn't worry about non-members attending otherwise.
Good. That's what I intended to allude to. I think a crypto task group could follow the same model that i18n is following -- towards standardization of a document separate from 262 but within TC39.
I agree about outside domain experts. In fact, I wish we could invite outside domain experts to participate in all TC39 activities as we deem appropriate. I do not understand the rationale for bounding invited expert participation.
On Sat, Mar 19, 2011 at 10:09 AM, Mark S. Miller <erights at google.com> wrote:
I agree about outside domain experts. In fact, I wish we could invite outside domain experts to participate in all TC39 activities as we deem appropriate. I do not understand the rationale for bounding invited expert participation.
I think this would be a good idea. If nothing else, providing "raw" crypto APIs can be a footgun, given the difficulties in actually using these ciphers and key management systems correctly.
Thomas Ptacek has a good post on this, and I've invited him to send me an elaboration that I'll forward to the group.
TL;DR, at the risk of my mis-summarizing Thomas' excellent exposition: APIs like Google's Keyczar, which provide a more complete and harder-to-misuse set of capabilities, would likely be a better idea, and invite fewer missteps. They would not be simple to implement robustly, and neither Keyczar nor cryptlib are licensed liberally enough to be baked into all implementations. That's a sign that it's a hard problem more than that those are bad solutions, though.
Mike
On Mar 19, 2011, at 4:12 PM, Mike Shaver wrote:
TL;DR, at the risk of my mis-summarizing Thomas' excellent exposition: APIs like Google's Keyczar, which provide a more complete and harder-to-misuse set of capabilities, would likely be a better idea, and invite fewer missteps. They would not be simple to implement robustly, and neither Keyczar nor cryptlib are licensed liberally enough to be baked into all implementations. That's a sign that it's a hard problem more than that those are bad solutions, though.
Are there any key-based encryption schemes that have actually succeeded with "normals"? In my view, when we look at GPG and PGP, the complexity was always the key to failure (pardon the pun, I couldn't resist). While I'm not opposed to something along those lines, I do think that the more traditional schemes should be considered, though perhaps discouraged.
On Sat, Mar 19, 2011 at 2:45 PM, Robert Accettura <robert at accettura.com> wrote:
Are there any key-based encryption schemes that have actually succeeded with "normals"?
TLS would be the obvious example; BitLocker and other encrypted file systems as well. We have hopes for the Firefox sync mechanism too, though we built our own cryptosystem to some extent, so...we'll see.
Most "normals" don't use crypto APIs of any kind, so I'm not quite sure what you mean.
In my view, when we look at GPG and PGP, the complexity was always the key to failure (pardon the pun, I couldn't resist).
API complexity? That's the reason for things like Keyczar: they provide an API where the simplest thing to do is also the safest, and provide fewer places for people to slip up in mode selection, key management, etc. Crypto is hard, and even very experienced practitioners get it wrong a lot. Giving people raw AES/SHA-256/etc. is unlikely to lead to them building secure systems, though it will likely let them believe that they did.
Keyczar et alii are not a panacea: you still need to actually manage the environment, but they take away a lot of error surface, and remove the need for a lot of arcane mathematical knowledge.
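Keyczar's real bindings are Java, Python, and C++, but a hypothetical JS rendering of its shape, just to show how little it exposes, might look like (names invented here):

    // Hypothetical JS analog of Keyczar's model: a keyset that handles key
    // rotation and versioning internally, plus two verbs. No modes, IVs, or
    // digest choices are exposed, so there is nothing unsafe to pick.
    var crypter = Keyset.read("app-keys");    // keyset knows its own algorithms
    var ct = crypter.encrypt("secret note");  // output embeds key version and IV
    var pt = crypter.decrypt(ct);             // selects the right key version itself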
While I'm not opposed to something along those lines, I do think that the more traditional schemes should be considered, though perhaps discouraged.
I don't see the value of adding something that we immediately discourage people from using.
Mike
2011/3/19 Brendan Eich <brendan at mozilla.com>:
In support of Robert's point, we have Firefox Sync [1], which client-side encrypts many blocks of user data (not just passwords; cookies, history, etc.) to hide it from our own (or an alternative; the server is open source) sync service.
You want to protect the user from a compromise of Mozilla's servers, but the JS code that does the encryption and the browser in which it is done are both served from those servers. I'm sure there are use cases for JS encryption but this doesn't sound like one of them. You can use SSL and encrypt on the server.
On 3/21/11 4:40 AM, Erik Corry wrote:
You want to protect the user from a compromise of Mozilla's servers,
We also want to protect the user from a subpoena served to Mozilla, for example. This means we must never have the data in the clear on our side, and this means the encryption needs to happen on the client, period. This is not negotiable for proper functioning of the feature in question.
Just FYI, you are going to run into the problem of key portability. If the key is derived from a password, your encrypted copy of the user's data (which might be subject to subpoena?) could be easily cracked via offline dictionary attacks. I hope you at least plan to use a salt, many iterations of a good key derivation function, etc. IMHO, users would be better off if you just t-of-n secret-shared their storage across multiple hosts in different countries instead, but I can see why that's a challenge.
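For instance, with SJCL's PBKDF2 (the iteration count here is illustrative, not a recommendation):

    // Derive the key from the password with a random salt and a deliberately
    // slow iteration count, making offline dictionary attacks on any
    // subpoenaed ciphertext proportionally more expensive.
    var salt = sjcl.random.randomWords(2);              // 64-bit salt, stored alongside the data
    var key  = sjcl.misc.pbkdf2(password, salt, 10000); // PBKDF2-HMAC-SHA256, 10k iterations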
Shabsi
wiki.mozilla.org/Labs/Weave/Developer/Crypto
Let's get back to es-discuss, ok?
I'd like to propose native cryptography support utilizing a simplified API for basic encryption/decryption. [...] Encrypting data quickly on the client solves many problems.
This could be useful beyond just web browsers; node.js comes to mind.
While encryption algorithms could be implemented in JS (and have been), doing so natively provides a boost, since modern hardware accelerates certain algorithms (AES-NI[3], for example), and it removes the need for a library. As client-side applications get more complicated and handle more data, especially in the mobile world where CPU and power consumption are key, this would make a big difference.
Cite: