
Memory leak with native fetch on Node v19+ #2143

Closed
kamilkisiela opened this issue May 31, 2023 · 10 comments · Fixed by #3510

Comments


kamilkisiela commented May 31, 2023

Version

v20.2.0

Platform

Darwin Kamils-MacBook-Pro.local 22.5.0 Darwin Kernel Version 22.5.0: Mon Apr 24 20:52:24 PDT 2023; root:xnu-8796.121.2~5/RELEASE_ARM64_T6000 arm64

Subsystem

No response

What steps will reproduce the bug?

  1. Clone the kamilkisiela/node-fetch-leak repository.
  2. Follow the steps to reproduce the memory leak in its README.md.

How often does it reproduce? Is there a required condition?

Always

What is the expected behavior? Why is that the expected behavior?

Constant memory usage

What do you see instead?

Increase in memory (heap snapshots available)

Additional information

To reproduce the memory leak, I created a simple node:http server that calls the native fetch() on every incoming HTTP request.
I ran a load test against that server and measured the memory (heap snapshots are available in the repository).
Next, I ran the same steps on Node v18.16 to compare the memory consumption.

It seems like Node v19 and v20 are leaking, while v18 stays stable.
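
For readers who do not want to clone the repository, here is a minimal sketch of the kind of server described above (illustrative only; the real code is in kamilkisiela/node-fetch-leak, and the port and upstream URL below are placeholders):

```js
// server.js — illustrative sketch, not the actual reproduction code.
// A plain node:http server that calls the native fetch() on every incoming request.
const http = require('node:http');

const UPSTREAM = 'http://localhost:4000/'; // placeholder upstream endpoint

const server = http.createServer(async (req, res) => {
  try {
    const upstream = await fetch(UPSTREAM);
    await upstream.text(); // consume the body so the response can be released
    res.writeHead(200, { 'content-type': 'text/plain' });
    res.end('ok');
  } catch (err) {
    res.writeHead(502);
    res.end('upstream error');
  }
});

server.listen(3000, () => console.log('listening on :3000'));
```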

| Requests made | v20.2 + fetch | v20.2 + undici v5.22.1 | v18.16 + fetch | v18.16 + undici v5.22.1 |
| --- | --- | --- | --- | --- |
| ~300k | 16MB | 19MB | 9MB | 11MB |
| ~600k | 23MB | 26MB | 9MB | 11MB |

Node v20.2

Native fetch

| Requests made | Memory |
| --- | --- |
| 295k | 16MB |
| 575k | 23MB |

Undici v5.22.1

| Requests made | Memory |
| --- | --- |
| 365k | 19MB |
| 650k | 26MB |

Node v18.16

Native fetch

| Requests made | Memory |
| --- | --- |
| 360k | 9MB |
| 680k | 9MB |

Undici v5.22.1

| Requests made | Memory |
| --- | --- |
| 366k | 11MB |
| 678k | 11MB |

Related to nodejs/node#47936

aduh95 transferred this issue from nodejs/node on May 31, 2023
kamilkisiela (Author)

Because the issue was moved to nodejs/undici, I pushed a commit with numbers and heap snapshots for node + undici.

mcollina (Member) commented Jul 2, 2023

Does it eventually lead to a crash, or is it just an increase in memory usage?

kamilkisiela (Author)

I ran it only for 360s, and the only thing I noticed was the increase in memory usage.

I also updated the kamilkisiela/graphql-hive repo to Node 20 and we hit OOM pretty quickly there (that is a different memory leak, related to Node's http module rather than the fetch API, but the two might be related).

metcoder95 (Member) commented Jul 2, 2023

Did you take only two snapshots for each case?

As for the behavior: does the memory keep growing, or did you notice drops in consumption after some time?

I'll try to look into this one this week.

kamilkisiela (Author)

I haven't gone deep into it; I just noticed a memory leak and reproduced it. I only looked at the memory usage twice: right after sending a small amount of traffic to warm up the server, and then right after sending heavy traffic.

mcollina (Member) commented Jul 3, 2023

Based on the data we have, this doesn't look like a leak but rather an increase in memory usage. Given that this happens across different versions of Node.js/V8, it is likely due to a change in the garbage collector algorithm, and not something we are responsible for.

Note that fetch requires a few "bad" patterns that are hard on the GC (finalizers).
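
For context, the "finalizers" mentioned here are the FinalizationRegistry hooks that a spec-compliant fetch uses for cleanup (for example, releasing response bodies that were never consumed). A stripped-down illustration of why that pattern is hard on the GC — not undici's actual code:

```js
// Illustration only. Cleanup registered with a FinalizationRegistry runs at a
// GC-determined time, which can be long after the object became unreachable.
const registry = new FinalizationRegistry((heldValue) => {
  // Invoked at some point after the registered object has been collected.
  console.log('finalizing', heldValue);
});

function makeResponseLike(id) {
  const body = { id, buffer: Buffer.alloc(1024) };
  registry.register(body, `body-${id}`);
  return body;
}

// These objects are unreachable as soon as the loop moves on, but the memory
// they hold is only reclaimed when the GC eventually gets to them.
for (let i = 0; i < 1000; i++) makeResponseLike(i);
```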

kamilkisiela (Author)

Let's say I run it for 6 minutes (as I did), then for 12 and 24 minutes, and the memory keeps increasing. That would mean it's a leak, no?

targos (Member) commented Jul 3, 2023

Can you try running node with, for example, --max-old-space-size=128? If there is a real memory leak, the process should crash once the used memory reaches around 128MB.
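
A sketch of how that check could be wired into the reproduction server (the server.js file name is an assumption, and the logging interval is arbitrary):

```js
// Run with: node --max-old-space-size=128 server.js
// Periodically log heap usage so a plateau (GC catching up) can be told apart
// from unbounded growth (a real leak) before the process hits the cap.
setInterval(() => {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1);
  console.log(`rss=${mb(rss)}MB heapUsed=${mb(heapUsed)}MB heapTotal=${mb(heapTotal)}MB`);
}, 5000).unref();
```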

kamilkisiela (Author)

Just to clarify, the reproduction is:

  1. Warm up the server: k6 run k6.js --vus 5 --duration 10s
  2. Run the first set of requests: k6 run k6.js --vus 5 --duration 360s and take a heap snapshot
  3. Run the second set of requests: k6 run k6.js --vus 5 --duration 360s and take a heap snapshot

The memory goes up after each step. I could run step 3 ten more times and I believe the memory would keep going up as well, and at some point it will go to space with Elon (OOM).
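
For reference, a minimal k6 script compatible with the commands above (the real k6.js ships with the reproduction repo; the target URL below is a placeholder):

```js
// k6.js — minimal sketch of the load script.
import http from 'k6/http';

export default function () {
  // Each virtual user repeatedly hits the node:http server, which calls fetch() upstream.
  http.get('http://localhost:3000/');
}
```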

mcollina (Member) commented Jul 3, 2023

> Let's say I run it for 6 minutes (as I did), then for 12 and 24 minutes, and the memory keeps increasing. That would mean it's a leak, no?

No, it does not mean it's a leak. When V8 is under CPU pressure, it tends not to clean up memory unless it is absolutely needed. This happens once memory has been promoted to "old space".

The command @targos mentioned will definitely show whether there are issues.
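
One way to separate "the GC simply hasn't bothered yet" from a genuine leak (a sketch, not part of the original reproduction) is to force a full collection right before writing each heap snapshot, so the snapshots only contain objects that are still reachable:

```js
// Run with: node --expose-gc server.js
// Writes a heap snapshot on SIGUSR2, after forcing a full garbage collection.
const v8 = require('node:v8');

process.on('SIGUSR2', () => {
  if (global.gc) global.gc();          // global.gc is only defined with --expose-gc
  const file = v8.writeHeapSnapshot(); // writes Heap.<timestamp>.heapsnapshot in cwd
  console.log(`heap snapshot written to ${file}`);
});
```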
