
Multiple CPU Core Usage? #40

Closed
eseglem opened this issue Oct 16, 2016 · 4 comments

Comments

@eseglem

eseglem commented Oct 16, 2016

While Sanic is super fast on a single process, it still leaves the other cores on the machine idle. I had hoped that, being 'Flask-like', it would drop into uWSGI just as easily, but that does not appear to be the case in my experimentation. Attempting to use it as a drop-in replacement for a Flask app returns TypeError: 'Sanic' object is not callable

I do not know if this is even feasible while still keeping the speed improvements, but compatibility with uWSGI, or some other method to fully utilize a server's CPU cores, would be extremely useful. If there is already a way to do this, I was unable to find it in the examples or docs, and more information on it would be appreciated.
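
For context, a minimal sketch of the kind of drop-in attempt described above (module and app names are illustrative): point uWSGI at a Sanic app as you would for a Flask app, e.g. uwsgi --http :8000 --wsgi-file app.py --callable app. uWSGI then invokes the application object as a WSGI callable, which a Sanic instance is not, hence the TypeError.

# app.py -- illustrative module for the uWSGI attempt above
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/")
async def index(request):
    return json({"hello": "world"})

# uWSGI expects a WSGI callable, i.e. app(environ, start_response);
# calling a Sanic instance that way raises
# TypeError: 'Sanic' object is not callable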

@santagada

I think the simplest way forward would be to use gunicorn. uWSGI does have support for asyncio, but it will never (I think) support uvloop, since uWSGI has its own event loop internally. Supporting both should be possible, though.
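
For illustration only (no gunicorn worker class for Sanic exists at this point, so the class name here is hypothetical): the deployment would look something like gunicorn app:app --workers 4 --worker-class sanic_worker.SanicWorker, where the custom worker class would be responsible for driving Sanic's asyncio/uvloop event loop instead of treating the app as a WSGI callable.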

@channelcat
Contributor

channelcat commented Oct 17, 2016

I've been looking into this one, and after some research I think the answer may be very simple. Sanic uses asyncio's create_server. If we spin up multiple processes and take advantage of the reuse_port feature, we get a simple way to distribute the load without much overhead. I ran some tests, and the results so far are much better than what I was getting from load balancing with haproxy or nginx.
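
To make the idea concrete, here is a rough, hedged sketch of the mechanism (not Sanic's actual internals; Sanic passes its own HTTP protocol to create_server, while this uses the higher-level start_server for brevity): each worker process binds the same port with reuse_port=True, and the kernel distributes incoming connections across the listening sockets (requires SO_REUSEPORT support, e.g. on Linux).

import asyncio
import multiprocessing

async def handle(reader, writer):
    # Placeholder handler; a real server would parse the HTTP request here.
    await reader.read(1024)
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nOK")
    await writer.drain()
    writer.close()

def serve(port):
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    # reuse_port lets every worker bind the same host:port; the kernel
    # then load-balances accepted connections between the workers.
    server = loop.run_until_complete(
        asyncio.start_server(handle, host="0.0.0.0", port=port, reuse_port=True)
    )
    loop.run_forever()

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=serve, args=(8000,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()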

If we were to deploy this way, scaling would simply be a matter of changing your run line to include workers:

app.run(host="0.0.0.0", port=8000, workers=4)

And then you can still just run it with python :D

@eseglem
Author

eseglem commented Oct 17, 2016

Being able to reuse a socket across all the workers would probably be a reasonable solution for this. Being able to specify that socket would also be nice; then you could point nginx, or whatever, at it. I haven't done any testing yet, but it will probably still be useful to have nginx on top for SSL, PKI, and things of that nature.

@channelcat
Contributor

Added multiprocessing in #57, also added a sock argument to Sanic.run
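
For anyone landing here later, a hedged sketch of what the sock argument enables (the exact behavior is in #57 and the docs; this is illustrative, not definitive): bind the listening socket yourself and hand it to run, so nginx or another front-end can be pointed at a socket you control.

import socket
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/")
async def index(request):
    return json({"hello": "world"})

if __name__ == "__main__":
    # Bind the socket ourselves (a unix domain socket would work similarly),
    # then let Sanic serve on it; a proxy like nginx can sit in front for SSL.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("127.0.0.1", 8000))
    app.run(sock=sock)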
