
Infinite scroll watchers trigger when the loaded page is smaller than the window size and eat CPU #281

Closed
Skomantas opened this issue May 20, 2016 · 17 comments


@Skomantas

Infinite scroll watchers do not cancel the loop (they keep triggering) when the page size is smaller than the window size, even after the page has loaded. Downgrading from "ngInfiniteScroll": "1.2.2" to "ngInfiniteScroll": "1.2.1" solves the problem.
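
For anyone who needs to stay on the older version in the meantime, pinning the dependency is enough. A minimal bower.json fragment (the package name matches the dependency string quoted above; adapt it to whichever package manager you use):

{
  "dependencies": {
    "ngInfiniteScroll": "1.2.1"
  }
}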

@RELnemtzov

Having this exact same problem. In my case, I have multiple small "widgets" on a page each with its own infinite-scroll.
CPU is running at 100% after upgrade. Downgrading to 1.2.1 as suggested above restored the expected behavior.

@rubenCodeforges

rubenCodeforges commented Jun 1, 2016

@Skomantas and @RELnemtzov I've faced the same issue, even on 1.2.1. The workaround I found is to provide a negative value for infinite-scroll-distance:

<div infinite-scroll="ctrl.yourMethod()" infinite-scroll-distance="ctrl.simpleVariable"></div>

(ctrl is used here because of controllerAs syntax; with plain $scope, bind the variable on $scope instead.)

// in the controller:
var self = this; // or use $scope

// start with a negative distance so the directive does not keep firing
self.simpleVariable = -1;

// use $timeout to wait 100 ms, then switch the distance back to 1
$timeout(function () {
  self.simpleVariable = 1;
}, 100);

This should help. I think there is a bad calculation of the heights and remaining space, and since the distance is used in those calculations we can cheat a bit.

@Skomantas and @RELnemtzov please reply if it helped
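
For completeness, a self-contained sketch of the workaround (the module and controller names here are hypothetical, and $timeout must be injected):

// Hypothetical module/controller names; only the simpleVariable switch is the actual workaround.
angular.module('app').controller('MyWidgetCtrl', ['$timeout', function ($timeout) {
  var self = this;

  // Negative distance keeps the directive from requesting more data while the view settles.
  self.simpleVariable = -1;

  self.yourMethod = function () {
    // load the next page of data here
  };

  // Restore a normal scroll distance once the initial layout has rendered.
  $timeout(function () {
    self.simpleVariable = 1;
  }, 100);
}]);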

@thephw

thephw commented Jun 10, 2016

It also eventually runs out of memory and crashes the Chrome tab...

@TheNemus mentioned this issue Jun 15, 2016
@eugeneware

This is a huge bug. Our team spent many days trying to track down the source of this. We've pinned our app to the previous version, as suggested above, to work around it. Please fix!

@rubenCodeforges

Guys, can you please confirm whether my fix did the trick?

@neilvana

@rubenCodeforges, the fix sometimes works for me, but depending on timing it may still cause issues. I had to fork the library and add the count=1 argument back on the initial $interval call. That has its own downside, since my "infinitely scrolling" data may not actually be infinite: sometimes my data sets are very large, and other times there may only be 1 or 2 records. In that minimal case the CPU goes crazy forever, since "shouldScroll" keeps being true. I guess that may not have been the intent of this library, which is why I didn't submit a pull request.
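
For reference, a rough sketch of the change described above (the handler name is illustrative, not the library's actual source): AngularJS's $interval accepts a count argument, and passing 1 makes the call fire once and stop instead of looping forever.

// Illustrative only: schedule the initial scroll check a single time.
// $interval(fn, delay, count) with count = 1 stops after one invocation,
// so the digest is not re-triggered every few milliseconds.
$interval(function () {
  handler(); // hypothetical name for the directive's scroll-check function
}, 0, 1);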

@rubenCodeforges

rubenCodeforges commented Jun 22, 2016

@neilvana if it sometimes doesn't work, try increasing the timeout:

$timeout(function () {
  self.simpleVariable = 1;
}, 500);

@rubenCodeforges

@eugeneware we had the same issue; hope it will be fixed.

@mebibou

mebibou commented Jun 23, 2016

I just stumbled upon this problem too: when profiling the page, I saw some code executed every 10 ms that was digesting the page, which seemed insane. Looking at the code, I saw it was coming from the $interval used in the case described here.
Then I looked at the source in the latest release and on master, and saw that it is fixed on master here: https://github.com/sroze/ngInfiniteScroll/blob/master/build/ng-infinite-scroll.js#L183

So, short story: if you use the version on master you should be fine; if not, you're screwed. Also, why in hell use an $interval that you cancel on the first iteration? You know $timeout is made for that, right? Please fix and release, this is critical!
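
To illustrate the point (a sketch, not the library's actual source; handler is a placeholder name): a deferred one-shot call can be written with $timeout directly, instead of an $interval that must be limited or cancelled after its first run.

// One-shot via $interval: only safe if the count argument (here 1) is passed.
$interval(handler, 0, 1);

// One-shot via $timeout: fires exactly once by design.
$timeout(handler, 0);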

@rubenCodeforges

@mebibou I hope the devs will spot this issue.

@thabemmz

For me this causes a huge memory leak in the browser as well when switching between pages that use ng-infinite-scroll, eventually causing some browsers to crash.

What I find surprising is that this issue seems to have been popping up only recently, while the 1.2.2 release was in January. Did anybody experience this issue before the end of May (when this issue was created)?

(It might very well be that I simply didn't catch the memory leak before and this is just a strange coincidence, but I wanted to check.)

@dohomi

dohomi commented Jul 4, 2016

Is this issue fixed in version 1.3?

@graingert
Collaborator

@dohomi should be, try it out.

@dohomi

dohomi commented Jul 4, 2016

ok I'll do it soon

@mebibou

mebibou commented Jul 4, 2016

@dohomi yes it is included in the release

@graingert
Collaborator

graingert commented Jul 4, 2016

@mebibou what is included in which release?

@graingert
Collaborator

duplicate of #235
