# Polymorphic Math methods, backwards compatibility, and polyfills (#17)
An additional concern with changing the existing methods is that it puts transpilers/polyfillers in the awkward position of needing to defensively clobber all these builtin functions with bigint-aware versions. These polyfills need to be included even if nothing directly references bigint in the code, since an external API could produce bigints. This is made worse because, for any usages of `Math` methods, it isn't possible to tell whether code expects the new or old behavior.
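To illustrate the concern, here is a minimal, hypothetical sketch of the kind of defensive shim a transpiler might inject; the structure is illustrative and not taken from any real polyfill:

```javascript
// Hypothetical BigInt-aware shim for Math.abs (illustrative only).
const nativeAbs = Math.abs;
Math.abs = function abs(x) {
  // Handle the BigInt case ourselves, since the current native method throws.
  if (typeof x === "bigint") {
    return x < 0n ? -x : x;
  }
  // Defer to the original implementation for Numbers.
  return nativeAbs(x);
};
```

Note that every Number-only call site now pays the cost of the extra `typeof` check, which is part of the performance concern raised later in this thread.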
Thanks for the comments! 😄
This situation requires two things to be simultaneously true:

1. Some real-world code relies on `Math` methods throwing `TypeError`s when given BigInt values.
2. That code actually receives BigInt values at runtime.
I would be quite surprised if the first condition were true in any real-world codebase, and I would be even more surprised if both the first and second conditions were simultaneously true in any real-world codebase (actually breaking something). Having said that, I do accept that someone theoretically might have done something weird and brittle… But this is the same as extending web APIs with new type overloading. Many web APIs’ functions have been extended to accept broader inputs without web-compatibility problems. Examples include (thanks @annevk): […]
Such function-input changes have generally been considered “web compatible”. I don’t think there is anything unique about extending Math functions in this manner that is different from extending web-API functions in this manner. So I would explicitly consider this risk to be small, the ergonomics (avoiding yet more globals for BigMath and DecMath) to be worth this small risk, and this approach to be in keeping with precedents set by many previous changes to web-platform APIs’ functions. I will add this to the explainer when I have time. However, if any browser team remains concerned about web compatibility, then use-counter data would be welcome, of course.
I would imagine that any polyfilling would continue to be opt-in. Libraries generally advertise any dependencies on language features they have, e.g., “This library requires promises.” I’m not sure how extending Math would be different—if a codebase has a dependency on a library that requires Promise, then it needs to include a Promise polyfill; if it has no such dependency, then it does not need to include a polyfill. The same would go for BigInt Math: “This library requires BigInt Math.” A codebase that has no dependency on a library requiring BigInt Math would not need a BigInt Math polyfill.
BigInts were added to the language after […] Thanks again for the comments! 😄
It is always the case that language builtins may stop throwing exceptions; code that relies on those exceptions is brittle. If you want to throw on BigInts reliably, you must explicitly check for that; that’s always been the case for any types.
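The explicit check described here can be sketched as follows; the function name is hypothetical and just for illustration:

```javascript
// Reject BigInts explicitly rather than relying on Math.abs throwing.
function absNumberOnly(x) {
  if (typeof x === "bigint") {
    throw new TypeError("BigInt values are not supported here");
  }
  return Math.abs(x);
}
```

Code written this way keeps its throwing behavior even if `Math.abs` is later extended to accept BigInts.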
@shicks every single language change likely requires that; speaking as one of the prominent polyfill authors: that’s not a burden, that’s just the way it works.
The slow rollout of engines with BigInt support (still only ~90%! Who wants to leave 10% of their potential users out in the cold? That's millions of people!), combined with the lack of polyfillability of overloaded operators, is a reason why many developers still use libraries (e.g. these) instead of native BigInts. Overloading […]
I don’t see why it would make anything messier. It’s very easy to feature-detect whether a function works with BigInt or not, and the less-than-five authors that write these polyfills will handle that for everyone else.
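The feature detection mentioned here might look like the following sketch (the function name is hypothetical); it probes whether `Math.abs` accepts BigInts, as this proposal would allow:

```javascript
// Detect whether the engine's Math methods have been extended to BigInts.
function supportsBigIntMath() {
  try {
    // In current engines, Math.abs(-1n) throws a TypeError;
    // under this proposal, it would return 1n.
    return Math.abs(-1n) === 1n;
  } catch {
    return false;
  }
}
```

A polyfill loader could run this check once and only install its BigInt-aware versions when it returns `false`.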
It’s certainly true that support for BigInts and BigInt Math will not reach >99% prevalence for many years, and that BigInt polyfills will be needed for a long time. But I agree with @ljharb: I’m not sure why monkey-patching Math is different from monkey-patching the many web APIs that have already also been gradually broadened.

As you say, any codebase that uses BigInts (but wants to accommodate the 10% of browsers that do not support them) cannot use a transparent polyfill anyway. (This is, of course, because the native BigInt API depends on syntax.) Such a codebase must write code like:

```js
JSBI.multiply(JSBI.BigInt(-2), JSBI.add(JSBI.BigInt(x), JSBI.BigInt(1)))
```

…with the intent for it to later be converted, years later, into:

```js
-2n * (BigInt(x) + 1n)
```

However, the situation of transpiling to operators is little different from having to transpile to other functions, like:

```js
JSBI.multiply(JSBI.BigInt(-2), JSBI.abs(JSBI.add(JSBI.BigInt(x), JSBI.BigInt(1))))
```

Years later, when the JSBI-using codebase gets transpiled to native BigInt syntax, calls like these would become:

```js
-2n * Math.abs(BigInt(x) + 1n)
```
So to summarize what you just said: people won't be able to actually write […]

I'm quite familiar with JSBI: I wrote it. And it's exactly what informs my opinion. Do you think that the whole situation around needing to use JSBI is desirable and should be repeated for more features? I think the contortions that people have to go through to use BigInts today, which are so much more cumbersome than most new JS features that can be polyfilled reasonably, are a lesson learned: don't spec future features in similarly non-polyfillable ways.

Monkey-patching comes into play once native BigInt support is sufficiently widely available that JSBI is no longer needed for emulating BigInts themselves (which brings a performance benefit). Monkey-patching builtins is widely considered bad practice for a variety of reasons (not the least of which: compatibility issues which then impede TC39's ability to standardize future features, as has happened before), but there's nuance: […]

So yeah, you can demand the existence of […]
@js-choi About opt-in polyfilling: we have found this to be unscalable. The library is the thing that knows what it depends on, so relying on the application (maybe transitively) depending on it to aggregate all these "extra dependencies" via some sort of documentation side-channel doesn't really work, for the same reason that NPM doesn't leave installing all the transitive dependency libraries up to the end user. We've had great success with Closure Compiler as a central point of detection for which polyfills (both of language syntax and of the standard runtime library) are required, allowing library authors to add polyfill dependencies as seamlessly as they can add ordinary library dependencies, without any work required by application authors (which would be a significant blocker in a "one version" monorepo).

While it may seem that […]

Our hope has been that, once native BigInt adoption reaches 99.x% and the final "highest-requirement" service finally stops demanding support for pre-BigInt browsers, then (and only then) we'll be able to actually get rid of JSBI and start using native BigInts everywhere. But if that would introduce these polyfill requirements on Math, then that basically moves the goalposts and pushes the GA date for BigInt back another 3-4 years.
The committee has repeatedly rejected polyfillability as a constraint on language design; much of the language post-ES6 would have been designed differently if maintaining polyfillability were a constraint, both before and after BigInt.
@ljharb It is surprising to me that, as a polyfill library maintainer, you aren't concerned by the performance implications of doing the BigInt polyfills for Math. There is no reason to overload the methods for BigInt; it doesn't help anyone. It just placates someone's desire not to have to decide on a new namespace (or to limit the number of namespaces). Yes, the committee cannot be constrained by polyfills, otherwise you couldn't add features like WeakMap, WeakRef, or other new capabilities, but that doesn't mean the committee should not consider what the implications of those polyfills are for the ecosystem.
@concavelenz i don’t agree those implications are particularly problematic, personally. Either way, it’s a better design for methods under “Math” to accept all mathematical data types. It very much helps me as a language user that Math methods work with all numeric primitives, and it would hurt me to needlessly increase the separation between BigInt and Number (and potentially Decimal in the future).
@ljharb The problem is that there already is a separation between BigInt and Number. The fact that Math and Number are two different namespaces is perhaps unfortunate, but as it currently stands, Math and Number operate only on numbers. That's a pretty clean invariant to maintain. Extending this to have BigInt methods operate only on bigints (and eventually Decimal methods operate only on decimals) is, I would argue, more consistent than a situation where the Math namespace is a grab bag of some functions that work on bigints and others that don't, and some that can mix types but others that can't.
I think that consistency is a subjective and variable thing, and it’s highly inconsistent that something named Math only works with one of the numeric primitives. The separation between BigInt and Number isn’t complete; there are a number of operators that work on both. |
First, I’d like to express gratitude towards everyone bringing their experience and insight about backwards compatibility, especially from Google Closure Compiler and from JSBI. Your concerns are important issues: thank you for raising them, and thank you for your patience in explaining them.

This topic was originally about backwards compatibility and polyfillability. So I think that discussing what benefit polymorphic Math methods would have over separate methods probably belongs in another issue (i.e., #14). My mental model as a developer had always been that Math methods are conceptually an “extension” of the math operators; I expect many other developers to share this mental model (@ljharb and @sarahghp also seem to), and polymorphic Math methods would match that mental model. Of course, as @ljharb says, consistency is subjective, but I think the benefits would be real. But, again, this part probably belongs more on #14.

Back on topic: There are concerns about the transition period for a codebase after it switches to native BigInt primitives but before it can use native polymorphic Math methods. A polyfill that monkey-patches the Number-only native Math methods would bring performance problems for all Number-only uses of the Math methods. And it would be difficult to keep the polyfill opt-in only; keeping track of the documentation of transitive dependencies is unscalable. These are all certainly true.

These problems already have a solution from the web-API space: standalone polyfill implementations (non-shim polyfills). During this transition period, standalone implementations would provide their own BigInt-supporting math methods separately, without monkey-patching the Math methods. We would discourage any monkey-patching or shims of Math. Eventually, many years from now, when native polymorphic Math functions are ready, the codebase would switch its calls from the standalone polyfill implementation to the native Math functions.
That is:

```js
// During the transition period, during which native BigInts are available
// but native BigInt Math is not:
import { bigMax } from 'standalone-library-that-does-not-monkey-patch-anything';

bigMax(0n, 1n, 2n, 3n);
Math.max(0, 1, 2, 3); // Not affected by the library.

// Later, once the userbase is ready, all bigMax calls
// would be replaced with Math.max calls.
```

I think that discouraging Math monkey-patching / shims and encouraging “standalone implementations” (temporarily, until there is enough BigInt Math support) may address concerns about the performance of polyfilling polymorphic Math. Thank you all again for your insight and patience in explaining your concerns.

[Edit 1: Replaced with alternative terminology as requested by @ljharb]

[Edit 2: See also tc39/proposal-decimal#31. I do agree with @littledan and @ljharb, although I appreciate @jakobkummerow‘s points.]
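For concreteness, a standalone implementation of the hypothetical `bigMax` used above might be sketched like this (the name follows the thread’s example and is not a real package API):

```javascript
// A standalone, non-patching max that works for BigInts and Numbers alike.
// The > operator is already defined across both primitives, so no
// monkey-patching of Math.max is needed.
function bigMax(...values) {
  if (values.length === 0) {
    throw new TypeError("bigMax requires at least one argument");
  }
  return values.reduce((max, v) => (v > max ? v : max));
}
```

Because it never touches the global `Math` object, Number-only call sites of `Math.max` elsewhere in the codebase keep their native performance.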
I think this is a wrong use of the word "inconsistent", but I mostly agree with the comment. I think another term would be "unintuitive", but there's the contradiction that having some methods work with `BigInt` […]

But we haven't talked about bike-shedding: if […]

Another point: the cognitive load of remembering which methods work with `BigInt` […]
Suppose code was written with the knowledge that the `Math` methods will throw exceptions if given `BigInt` values. If the implementation of some methods changes to allow `BigInt` values instead, that code will not behave as designed. If it were necessary to call a new method to get the new behavior, then the potential for this bug would be eliminated.

It would be good to explicitly consider whether it is better to incur this risk than to add a new method.
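The hazard described here can be made concrete with a hypothetical example of code that treats the current throwing behavior as a signal:

```javascript
// Written against current behavior, where Math.round throws a TypeError
// when given a BigInt.
function toIntegerOrZero(x) {
  try {
    return Math.round(x); // today: throws for BigInt inputs
  } catch {
    return 0; // the author relied on the throw to detect non-Numbers
  }
}
// If Math.round were later extended to accept BigInts, toIntegerOrZero(5n)
// would start returning 5n instead of 0, silently changing behavior.
```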