How to treat versions of the stdlib and external packages #371
Comments
Option B: Phase out old stdlibs as nimble pkgs

Here's another alternative (let's call it Option B) that can prevent stdlibs named like `json2`. A newcomer to Nim should simply need to do `import json`.

I am pasting my comment from the Nim Forum here so that this discussion stays contained in this thread. About the versioning of stdlib modules, would this work (a different idea than what I brought up in the meeting):
To add to what @PMunch said, a few more disadvantages of the `json2`/`std2` approach:

Summary:

But then if I actually use the non-ugly …

You would have the minimum or exact Nim version dep in the .nimble file.

I don't agree - why would it be confusing?

A new user would need to know that they need to import `json3`; a few years later, the then-new user would need to import `json5`. Something about that doesn't feel right.

Nobody really has to think about this either way; you cannot just guess a module name and get away with it, you need to read about what's available somewhere (the index, tutorials). And these resources tell you which names to import.

And why exactly is this preferable? Meta information that is not part of your program directly is worse than what is accessible from the source code directly.

This is what concerns me most -- honestly, it feels like all the arguments against … So ... if I come up with a sugar rule like "…"
If we are going down the route of allowing multiple versions to coexist, we need to let the package manager do the version juggling; baking version numbers into package names doesn't make sense - it's not scalable. If this is allowed, the stdlib will get populated with arbitrary packages with version numbers encoded into them.

It's more than that. The stdlib will gradually gather a lot of old baggage. It won't be a collection of "standard" libraries, but just a collection of all versions of modules. E.g. re and nre.

Well ... it's still a "standard library". Yes, it accumulates cruft; that's simply the nature of the beast. No, this doesn't mean a random collection of modules with trailing digits in their name (oh the horror! numbers! we should be scared!)

We can embrace …

Python had urllib2 and it survived: https://docs.python.org/2.7/library/urllib2.html

You misunderstood me. I gave the example of re and nre because that's what I fear will happen for json, json2, etc. We will just collect all the old modules in the stdlib, even the ones with inefficient algorithms. I didn't mean that re and nre are acceptable; it was the opposite.
The proposal does not say how the versions are selected inside the imported module itself. If we do `when version(v2): ...`?
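As a hedged illustration of what compile-time version selection inside a module could look like: the `version(v2)` form above is hypothetical, but Nim's `when` statement with the real predefined `NimMajor`/`NimMinor` constants can already express something similar (the procedure and its two behaviours are purely illustrative):

```nim
# Sketch only: gate two API variants on the compiler version at compile time.
# NimMajor/NimMinor are real compile-time constants; the rest is invented.
when (NimMajor, NimMinor) >= (1, 6):
  proc parseGreeting*(): string = "new API"   # hypothetical v2 behaviour
else:
  proc parseGreeting*(): string = "old API"   # hypothetical v1 behaviour

echo parseGreeting()
```

Only one branch is compiled into the module, so callers on old compilers keep the old behaviour without any run-time cost.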
I don't think so, and the 'you' in my reply was a general 'you'; don't take it personally. Just imagine we had done that: added regex, deprecated and then later removed 're' and 'nre'. Nobody would have complained, nobody would have made the connection to numbers and versioning and package management, everybody would have accepted it: Nim version X simply ships with a better regex library and the import is now `regex`.
Let's not complicate the module system with dependency resolution and other fun package management features. We already control module search paths during compilation, allowing us to select specific module versions without changing the language.
But you can't expect the user to have more than one version of the standard library installed. When you start increasing the amount of magic during compilation, to silently install more stdlib versions, you're setting the user up for some interesting debugging sessions.
Usability trumps aesthetics. When I see …
Good. Backwards compatibility is very important for end users.
Confusing how? What can the user do instead of …?
Now that is confusing. So the … Counterintuitive and therefore dangerous.
We seem to keep coming back to this topic: the standard library is a place where code goes to die without there being a process for cleaning up the debris, and therefore the amount of bad code keeps accumulating, creating confusion and frustration. At the same time, it takes months to get critical security fixes into modules that need them, because it's hard to produce a working release when there are so many changes spread across the language, compiler and all the libraries - this creates an artificial dependency between all modules and the compiler which otherwise wouldn't have to be there, and which will keep slowing the pace of Nim development down over time. This is completely unsustainable. The really easy and simple solution is to not add more stuff to the standard library, and to move the existing libraries into separately versioned packages. To get there, the following needs to happen:
There are a few more nice-to-haves, such as a vendoring option that will create a full, stand-alone release of the code. In using the standard library today, when we type …

The other thing we're seeing is that the old, fundamentally broken versions of the modules cannot be removed in any sane way - again, this is solved with packages: the unused modules fade from use and that's it, the historic versions continuing to exist as long as there's someone with a vested interest / enough skin in the game to keep them alive. In one fell stroke we also avoid the discussions around "but you should maintain this package for me".

The flawed assumption here is that when typing "import json", it's the latest version I'm after - it's often not: in any application that already works, or in a library that is not central to the application, I want the "most working" version of the library, and that means using whatever version was used during development and not changing it until there's an explicit decision to do so (and testing that decision).

The healthy way to treat package or module versions is that they are different libraries with similarities which in some cases end up being compatible, and which, when tested, can be upgraded together.
On what-if decision-making, per request in chat.
I continue to fail to see the benefit in "instead of a stdlib, here is a list of Nimble packages": not reviewed, using different styles, of different code quality, with different attitudes towards backwards compatibility. You always act like a fragile web of dependencies is a desirable thing to have, and then you "only" need really good tooling to compensate for this design - but I think it's a terrible design to begin with.
This is probably the point that's misunderstood the most in these discussions - there's absolutely no reason why the packages that are moved out of the "stdlib" would have lower quality or adhere to different standards - they would / could still be maintained by the community of …
A mechanism to replace it is direly needed at this stage - one that doesn't break existing code that is happy with the status quo, but one that at the same time frees the community to move on and improve the situation, even if these improvements require changes to user code - at that point users have a choice: they can upgrade, or stay where they are.

Old versions continue to work as long as changes to Nim don't break them, and when Nim gains features, it's easy to gate newer versions of the package to Nim versions that support the prerequisite feature - like the introduction of …

Crucially, since the module is maintained by the Nim community, when a new, better version comes around that the curators of …
It's the other way around - the current low standard of tooling is preventing a more dynamic development environment where releases of packages don't have to be artificially tied together to maintain a semblance of quality. The existence of the std lib is a crutch that compensates for the lack of adequate, minimum-standard tooling, and it creates an unsustainable bottleneck for the future, where the only thing people can imagine is to add more and more stuff to it - i.e. "module X in the std lib is broken and needs module Y to fix it - let's add module Y to the standard lib also!". Imagine that in order to fix a bug in …
We already have each version of the stdlib on github through their respective branches; because of that we could do …
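For illustration, Nimble can already pin a dependency to a git URL with a branch or tag ref, so this idea might look roughly like the following .nimble fragment (the repository path and branch name are hypothetical, chosen only to sketch the shape):

```nim
# Hypothetical .nimble sketch: pin a dependency to a specific git ref.
requires "nim >= 1.4.0"
requires "https://github.com/nim-lang/some-stdlib-module#version-1-4"  # branch is illustrative
```

The same `#ref` suffix works for tags, which would let packages track an exact stdlib snapshot rather than a moving branch.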
As inconvenient as it may be, "really good tooling" is what is expected from a programming language. It doesn't matter if a language is the fastest or most intuitive out there; if developers encounter too much friction attempting to set it up, build it, or use code developed by others, they are going to give up.
It seems like the main complaints here are really about "the overall quality and maintainability of the stdlib has become a burden on Nim development" rather than how to version packages. The whole issue of versioning is simply a symptom.

It might be a good reason to consider a new major release for Nim, making it Nim 2.0 with the main change being the stdlib, somehow ensuring backwards compatibility with existing Nimble packages (this is the tricky part). Give Nim a fresh start with a lean and mean stdlib, along with stricter acceptance criteria for stdlib code additions.

Say the new stdlib has no json module to start: that's fine, because most agree the Nimble packages are better, and in this plan they are still valid! If each stdlib module is diligently planned out, then users will move FROM Nimble packages TO stdlib modules. Having holes in the stdlib in the shape of JSON support, YAML support, native UI support, etc. will allow the community to decide what is important to them. Nimble packages are made. The community essentially "votes" by using packages. Popular packages either inspire stdlib support efforts, or end up being directly rolled into the stdlib if they meet acceptance criteria and the author supports such a move.

It may seem like a drastic step to take, but maybe it is necessary? In the modern day, does any Python dev speak of Python 1? Python 2 simply introduced some new data types, additional syntax, some new modules, some removed modules. It seemed like the greatest driving factor for Python 2 may have even just been a change in how development of Python itself was organized. Source: https://docs.python.org/3/whatsnew/2.0.html

Not that we should believe that if Python did it, it must be good or correct. Simply pointing out that there may be a reasonable path forward by taking these learned lessons and wiping the slate clean.
Given the tricky part of supporting Nimble packages from the previous major compiler version, this could be done if, similar to Python 2, the old stdlib was accessible somehow. This is where tooling comes in. Since .nimble files already track the Nim version associated with them, when a Nim2-compatible Nimble sees a Nim1 package, it knows the stdlib imports have a different root path. We can introduce deprecation warnings so Nimble package developers can update their packages to Nim2, directly reference the old stdlib, and begin to migrate to the new stdlib.

This is fundamentally very similar to an option that was discussed at the last Nim dev meetup, when it was suggested that we make a "new stdlib that is only compatible with itself". This is basically going down that path, but asking for a specific effort to not make the same mistakes with the new stdlib. Don't allow the maintainability of the new stdlib to degrade for the sake of having everything and the kitchen sink.
In reality moving code around makes the problem worse -- we tried that with nim-lang/zip etc. and more recently with 'fusion'. You need to ensure that the CI covers the Nim+fusion combinations that you actually try to support (N * M combinations where previously it was 1 * 1), you need to ensure that these modules show up in the generated documentation, and when you ship them your users must understand to use the appropriate issue tracker. Then when a new language feature becomes available that the library would benefit from, the library must migrate to a newer Nim compiler, so then only an older version of the library works with an older Nim compiler (or it uses the …).

But this is all a bit beside the point. The original problem was "how do we get something like json2 into the stdlib?" Here are some possible options:
IMO, … What happens if we need to break part of the API of …? What happens once everyone has stopped using …?

Overall, I generally agree with @arnetheduck: the stdlib should be treated as a dependency. As such, this becomes a dependency resolution problem (that we try to solve using the module system, somehow). To put it differently, would we recommend that a new Nimble package be published every major version release?
If only there were a way to re-use code via templates, Nim's import-export mechanism, …

Aren't "important packages" already tested as part of CI on every release/commit (i.e. we don't have …)?

They are tested, yes, but only a single specific commit/version of every important package (well, usually "git master").
Absolutely not - the whole point is to not have to release things together: the compiler can make a release whenever and the package can make a release whenever, independently. Of course sometimes it makes sense to release together (nim compiler gets …). What changes is that at some point, the nim manual and other community resources start recommending other libraries, and these will get picked up the same way the std lib libraries are picked up today - but it's a softer transition in which old code keeps working and existing users can upgrade at their leisure when it's worth it for them, not based on the need to make a Nim release; and conversely, it becomes a lot easier to make a package and/or Nim release to address critical issues.
only when the tooling to do so is inadequate - this is what tooling is for, to create an environment where these operations are not made artificially expensive. If you switch from nails to screws without a screwdriver, of course you'll be disappointed. Conversely, when all you have is the "add-to-stdlib-hammer", of course that looks like the only viable option - and more sadly, the path of least resistance.
Yes, of course it should be done, if there is an interest to do so, and that interest is driven by the people with skin in the game: the users of that version of the library - they can either persuade the maintainers to backport or do the work themselves - that's the mechanism that makes this a scalable solution. When it's a single package that focuses on one thing, it's also not expensive at all: you do the backport and you don't have to consider fixes to 100 other libraries to make the release with the backport - it's a much more tractable problem to fix a bug in json and release / backport it, than to navigate the history of a giant monolith and try to cut a release.
with the right tooling (monorepo support in PM), the packages can even live in the same git repo for all the package manager cares - this is convenient even, for certain groups of packages (the html parser and the web server).
That's rather the point: you don't have to, most of the time, unless rough consensus becomes clear. With decent tooling, the incentive to add things to the std lib is much smaller.
If …

Don't. Spend the effort on upgrading the tooling, then make an incompatible separate package and start recommending it in manuals. The old …
The question was rhetorical; of course it's a terrible idea to release a new package every version. Yet I feel like this is what we're doing with …
Hmm, "monorepo support in PM" - I will give this some serious thought.
Just on this point - Nimble supports this already. We even have a repo in nim-lang that utilises this, graveyard; it's done like this: https://github.com/nim-lang/packages/blob/master/packages.json#L12009.
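For reference, a monorepo entry in packages.json looks roughly like the following (the fields below are sketched from memory and hedged; the `?subdir=` query in the URL is what tells Nimble which subdirectory of the repository holds the package):

```json
{
  "name": "smtp",
  "url": "https://github.com/nim-lang/graveyard?subdir=smtp",
  "method": "git",
  "tags": ["email", "smtp"],
  "description": "SMTP client library, moved out of the stdlib",
  "license": "MIT"
}
```

With this scheme, many packages share one git repository while remaining individually installable via `nimble install smtp`.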
Yeah, I know. So let's make this concrete: how do we use it to bundle nim-regex with the next major Nim release?
@Araq depends what your goals are. I'm guessing, but do you want the package to be bundled with Nim and also installable via Nimble? Why? |
I want to bundle it in the tarball/zipfiles because "just nimble install x" is not allowed in some environments, no matter how much we like Nimble. Plus the downsides (no git history) don't apply to the zipfile, as it has no … It shall be update-able via Nimble.
The normal way is that the package manager has a bundle command of some sort. For such bundling, the metadata must of course follow along so that the packages can be identified.

Now, with good tooling, this feature also becomes much better for users: instead of having a standard library with lots of stuff they're not interested in, and lacking the stuff they are interested in, a …

Now, the really cool thing would be if the bundle command also included a dep on the language itself - when you bundle a project, you get the compiler with the package and it's treated like just one more library - this creates a fully reproducible zero-to-hero story rivalled by few out there. Of course, later the PM can optimize things and cache and so on, but that doesn't affect the mental / abstract model under which it operates.
That's not the reason this should be done. The reason should be: we want this to be the officially supported way to do regex in Nim.
Unfortunately to support this properly non-trivial changes will be required. There may be simple "hacky" ways to accomplish this though and we might want to do so anyway. |
Yes, that too.
Please elaborate.
So we can easily bundle a package with Nim:
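A minimal, hypothetical sketch of what such bundling could look like (the directory layout, file names, and `dist/` convention are all illustrative assumptions, not the actual Nim release scripts):

```shell
# Hypothetical sketch: copy a package checkout into a Nim release tree.
set -eu
mkdir -p regex/src                    # stand-in for the nim-regex checkout
echo 'proc dummy*() = discard' > regex/src/regex.nim
mkdir -p nim-dist/dist                # bundled packages could live under dist/
cp -r regex nim-dist/dist/regex
# The compiler would then need e.g. --path:nim-dist/dist/regex/src to find it.
```

The copied tree ships inside the tarball/zipfile, so no network access or Nimble invocation is needed at install time.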
The problem is that Nimble has no clue this exists, so if the user wants a newer …
Here is an idea: "nimble install regex" installs regex like it always does, and there is a mechanism so that --nimblePath is preferred over …
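A hedged sketch of how such a preference could be spelled in a nim.cfg fragment: `--nimblePath` and `--path` are real compiler switches, but the search order shown, the `$home`/`$nim` variables, and the `dist/` location are assumptions of this sketch only:

```
# Hypothetical nim.cfg sketch: prefer user-installed Nimble packages
# over a bundled copy shipped inside the Nim distribution.
--nimblePath:"$home/.nimble/pkgs"
--path:"$nim/dist/regex/src"
```

Under this scheme, a user who runs `nimble install regex` would shadow the bundled copy without touching the Nim installation itself.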
The other question when bundling is what to do with packages that don't declare their dependency on …
My 2c: it doesn't really matter if people don't use, say, a sanctioned JSON library, but mix and match them according to their needs, as long as they are interoperable. So in my opinion the stdlib should ship datatypes for JSON (and HTTP, HTML, regexes, and so on) and never change them, while library authors should be encouraged to do whatever they please, but ensure that their libraries work with the official datatypes - so that one can, say, parse an HTTP request with lib1, add some headers to it with lib2, and finally make the request with lib3.
I see this issue hasn't been discussed much lately - is it considered for Nim v2? I think this would be a great way to allow standard library improvement.
@mildred do you mean my original proposal? Or any of the other ideas put forth in this thread? AFAIK there hasn't been any consensus on what to do about imports. |
This is from the discussion during the 4th online Nim meetup, where stdlib module versioning was discussed. The common example used was the `json` module, so I will use that here as well.

The scenario is that a new implementation of `json` comes along that we want to replace the old one in the standard library. This module has a different API and can't simply be replaced as a drop-in because it would break everyone's packages. The worrying idea that seemed to have some traction was to simply name this new module `json2`, or potentially put all new modules in `std2` so it would become `std2/json` (I assume all the other standard library modules that depend on `json` would then be upgraded and placed in `std2`). In my mind this is a horrible solution, and not only because of the extremely unsightly names `json2` or `std2`. This is reminiscent of how C would do it, and in fact has, simply because it lacks the package manager and versioning that Nim has.

The fact is that the Nim standard library is already versioned, by the Nim version. That users have grown to expect the standard library not to change much in-between versions is simply because it historically hasn't changed tremendously.

My proposed solution is to simply allow imports to be versioned with the Nim standard library version they expect; otherwise the import will use whatever the currently selected Nim version is shipping. The syntax could extend from the existing pragma syntax:
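Based on the `import json {.v1.4.6.}` form quoted later in this proposal, a sketch of the proposed syntax might look like this (proposed syntax only - it does not compile with any released Nim):

```nim
# Proposed syntax only - not valid in any released Nim compiler.
import json {.v1.4.6.}   # pin to the json module shipped with Nim 1.4.6
import strutils          # unqualified: use the current Nim version's module
```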
Or it could be expressed in a different way entirely. This has several benefits:

- `import json` would get you the current JSON module.
- The old `json` module would not need to be maintained for new compiler versions (if someone wanted to maintain it they could create an external module, or they'd have to pin their module to use an older Nim version).

The `json2`/`std2/json` approach similarly has a lot of disadvantages:

- What happens to a module that needs to stay in `std2` because it depends on `std2/json` but itself wants to move to a newer API? Now we need to support the original, the `std2` version, and a new `std3` version.
- We either have to keep the old `json` module working on the latest compiler, or delete it and then only have a `json2` module.

One critique of the `import json {.v1.4.6.}` syntax is that an unqualified import would pull the newest version. So if I have an external module `mymodule` that depended on the current JSON implementation, it would potentially break when imported into a project that uses the new JSON implementation. However, all Nimble modules already specify their required lowest Nim version. So this is no harder to solve than to assume that a Nimble module with a dependency on Nim >1.4.6 should also be served the 1.4.6 JSON module. This obviously requires the Nim compiler to support importing two different JSON modules into the same project, with potential passing of compatible data objects between them, but as far as I know this is a feature that is already planned and in the works.

These are just my two cents, but I think it resonates with more people, from what I could gather in the discussion. In my opinion the `json2`/`std2/json` naming is extremely ugly and would put me off using Nim if I saw it as one of the first things when I encountered the language.