Added search-depth-limit parameter #23
Conversation
Limits the depth of cycle reporting to a fixed size limit for a list of files. This helps improve performance, as most cycles do not surpass a few related files. As I mentioned in the issue, we could theoretically fix the crashes by switching to an iterative solution. This is still a desirable change IMO because erroring files will do so faster. Fixes #18 (recreation of #20, but with a good git history)
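For context, here is a minimal sketch of what a depth-limited cycle search can look like. The `findCycle` and `getImports` names are hypothetical, chosen for illustration; this is not this PR's actual implementation:

```ts
// A minimal sketch of a depth-limited cycle search, assuming a
// getImports(file) helper that resolves a file's import paths.
function findCycle(
  file: string,
  path: string[],
  searchDepthLimit: number,
  getImports: (file: string) => string[],
): string[] | undefined {
  // Revisiting a file already on the current path means we found a cycle.
  if (path.includes(file)) {
    return [...path, file];
  }

  // Give up once the path grows past the configured limit: most real
  // cycles span only a few related files, and bounding the recursion
  // keeps deep (but acyclic) import chains cheap to walk.
  if (path.length >= searchDepthLimit) {
    return undefined;
  }

  for (const imported of getImports(file)) {
    const cycle = findCycle(imported, [...path, file], searchDepthLimit, getImports);
    if (cycle !== undefined) {
      return cycle;
    }
  }

  return undefined;
}
```

The trade-off shown here is the one discussed below: a limit smaller than a cycle's length means that cycle goes unreported.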
One nit, looks great otherwise. Thanks for the continued contributions @JoshuaKGoldberg!
return [accumulator]

if (iterationDepth >= context.options.searchDepthLimit)
    return []
Worth a console.warn?
Ideally no. Large codebases will hit this a lot.
With this change, when I have a cycle in a big codebase, will I ever find out about it?
Or do you think the log wouldn't be useful, since there likely isn't a cycle (just a deep dep graph)?
Yeah, that's what I'm thinking. From working in codebases with a few hundred or a few thousand files, most files just import a few others around their own directory. Even 50 is pretty extreme.
...then again... you're right that if this results in nothing, but we know there was at least one cycle and it happens to be ridiculously long, I don't know what we could do to intelligently show that.
Sorry for the brevity, on mobile :)
@@ -0,0 +1 @@
import * as two from "./two";
What are you trying to test with these files?
These validate that the search-depth-limit option is respected. one imports from two, which imports from three, which imports from four, which imports from one, for a depth of four. Adding the option to tslint.json means these shouldn't cause a lint complaint, which is why I didn't modify the list of expected complaints.
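For anyone wondering what enabling the option might look like, below is a hypothetical tslint.json sketch. The rule name and exact option shape are assumptions for illustration; see the project's README for the real configuration.

```json
{
  "rules": {
    "no-circular-imports": [true, { "search-depth-limit": 3 }]
  }
}
```

With a limit of 3, the four-file cycle in these fixtures would go unreported, since the search gives up before closing the loop.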
Ping @bcherny - is there anything else that should be done on my end?
Looks good, sorry about the delay. Published 0.6.0.
Awesome, thanks!