Deadlock when downloading git dependencies #2987

Closed
matklad opened this issue Aug 13, 2016 · 9 comments · Fixed by #2991
Comments

matklad (Member) commented Aug 13, 2016

Repro:

$ rm -rf ~/.cargo/registry
$ git clone [email protected]:tokio-rs/tokio.git
$ cd tokio
$ cargo metadata& 
$ cargo metadata& 

Updating registry `https://github.com/rust-lang/crates.io-index`  
Blocking waiting for file lock on the registry index 

Blocking waiting for file lock on the git checkout
Blocking waiting for file lock on the git checkout
matklad (Member Author) commented Aug 13, 2016

Looks like the same issue as #2566.

matklad (Member Author) commented Aug 14, 2016

So this looks like a textbook deadlock: each git source holds its lock while other git sources are being updated, and the locks are acquired in no particular order (the resolution graph is non-deterministic because of HashMaps).

Making the locking order deterministic doesn't look like an option: how would one order git dependencies coming from different packages?

alexcrichton (Member) commented
Yeah, this happens frequently with two git deps, and you're right that it's because the git locks aren't ordered. Unfortunately it's not actually possible to order the locks, because we don't know the set of locks we need to take at the beginning, at least not with the current strategy.

Some options we have here are:

  • Keep track of all locks held and impose a global ordering. Whenever we acquire a lock, first drop every held lock that comes after it in the ordering, then re-acquire them.
  • Hold locks only temporarily during resolution, and then grab the whole set afterwards, once we know what it is.

I'd probably prefer the latter option (or another alternative) to solve this; dropping and re-acquiring locks sounds like a pain!
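The second option could be sketched like this (hypothetical `LockPlan` type, not Cargo's actual code): resolution only records which checkout locks it will need, and the full set is then acquired in one sorted pass, so every process computes the same acquisition order and no two processes can cross-acquire.

```rust
use std::collections::BTreeSet;

/// Hypothetical sketch: resolution records lock keys instead of taking
/// locks, then everything is acquired at the end in sorted order.
struct LockPlan {
    needed: BTreeSet<String>, // e.g. git checkout directories
}

impl LockPlan {
    fn new() -> Self {
        LockPlan { needed: BTreeSet::new() }
    }

    /// Called during resolution instead of locking immediately.
    fn note(&mut self, key: &str) {
        self.needed.insert(key.to_string());
    }

    /// After resolution: the deterministic order in which to actually
    /// acquire the locks. Identical in every process.
    fn acquisition_order(&self) -> Vec<String> {
        self.needed.iter().cloned().collect()
    }
}

fn main() {
    let mut plan = LockPlan::new();
    // Resolution visits sources in HashMap (i.e. arbitrary) order...
    plan.note("checkouts/tokio-core");
    plan.note("checkouts/futures-rs");
    plan.note("checkouts/tokio-core"); // duplicates collapse
    // ...but the locks are always taken in the same sorted order.
    assert_eq!(
        plan.acquisition_order(),
        vec!["checkouts/futures-rs", "checkouts/tokio-core"]
    );
    println!("{:?}", plan.acquisition_order());
}
```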

matklad (Member Author) commented Aug 14, 2016

What about using a single lock for all git deps?

Hm, and what if the resolution DAG includes the same git dependency twice, but with different revisions? Is it impossible to build a package with such dependencies, because both revisions would have to be checked out to the same directory? EDIT: apparently not, because they are checked out to different directories.

matklad (Member Author) commented Aug 14, 2016

Some more alternatives:

  • Just bail out with an error if we try to lock a repository and it is already locked: a sort of conservative deadlock detection.
  • Always check out a precise revision, so that the lock is needed only during the checkout itself. That is, instead of tracking mutable branches, use append-only commit hashes.
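The first alternative amounts to replacing a blocking acquire with a try-acquire that fails fast. A sketch, with an in-process `Mutex` standing in for the file lock and a hypothetical `lock_or_bail` helper:

```rust
use std::sync::{Mutex, MutexGuard};

/// Hypothetical helper: instead of blocking on a contended lock, error out,
/// so a second cargo process sees an error rather than a silent deadlock.
fn lock_or_bail<'a, T>(
    lock: &'a Mutex<T>,
    what: &str,
) -> Result<MutexGuard<'a, T>, String> {
    lock.try_lock().map_err(|_| {
        format!("failed to lock {}: already locked by another process", what)
    })
}

fn main() {
    let checkout = Mutex::new(());

    // First acquisition succeeds...
    let guard = lock_or_bail(&checkout, "git checkout").unwrap();

    // ...a concurrent attempt fails fast instead of blocking forever.
    let err = lock_or_bail(&checkout, "git checkout").unwrap_err();
    assert!(err.contains("already locked"));
    drop(guard);

    // Once the lock is released, acquisition works again.
    assert!(lock_or_bail(&checkout, "git checkout").is_ok());
    println!("conservative deadlock detection: bail instead of block");
}
```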

alexcrichton (Member) commented
Hm, I think I actually like the single-lock idea. It's pretty bad in terms of UI (it makes the feature almost unusable concurrently), but git dependencies should be rarer than crates.io ones, and it's better UI than deadlocking. So sounds good to me!

matklad added a commit to matklad/cargo that referenced this issue Aug 14, 2016
matklad (Member Author) commented Aug 14, 2016

The question is who should own that single FileLock? The only "global" object readily available in the git source is Config. Stuffing a RefCell<Option<FileLock>> inside the Config would probably work, but perhaps there is a better way?
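A sketch of that shape, with `std::fs::File` standing in for Cargo's `FileLock` and the struct reduced to the relevant field (an illustration of the discussed design, not the actual patch): the slot starts empty, and the process-wide lock is acquired lazily the first time any git source needs it.

```rust
use std::cell::RefCell;
use std::fs::File;
use std::io;
use std::path::PathBuf;

/// Reduced sketch of Config; `File` stands in for Cargo's `FileLock`.
struct Config {
    lock_path: PathBuf,
    git_lock: RefCell<Option<File>>, // one lock for *all* git sources
}

impl Config {
    fn new(lock_path: PathBuf) -> Config {
        Config { lock_path, git_lock: RefCell::new(None) }
    }

    /// Called by every git source before touching its checkout. The first
    /// call acquires the process-wide lock; later calls are no-ops, so a
    /// single cargo process never contends with itself.
    fn ensure_git_lock(&self) -> io::Result<()> {
        let mut slot = self.git_lock.borrow_mut();
        if slot.is_none() {
            // The real implementation would block on an exclusive file
            // lock here; creating the file just marks where that happens.
            *slot = Some(File::create(&self.lock_path)?);
        }
        Ok(())
    }
}

fn main() -> io::Result<()> {
    let config = Config::new(std::env::temp_dir().join("cargo-git-lock-demo"));
    assert!(config.git_lock.borrow().is_none());
    config.ensure_git_lock()?; // first git source: acquires the lock
    config.ensure_git_lock()?; // second git source: already held, no-op
    assert!(config.git_lock.borrow().is_some());
    Ok(())
}
```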

alexcrichton (Member) commented
@matklad yeah I'd expect Config to hold it as it's the "global state" of Cargo

bors added a commit that referenced this issue Aug 15, 2016
Deadlocks with git dependencies

Only a test for now to verify that it actually fails on Travis.

Hopefully will close #2987