
How much GPU memory is needed to run SPHINX? #112

Closed

swearos opened this issue Nov 21, 2023 · 3 comments

Comments

@swearos

swearos commented Nov 21, 2023

A 24GB GPU runs out of memory.
Would you consider releasing a smaller model, one that can run under 24GB?

@yaomingzhang

I also want to know how many resources are needed. Where does it run out of memory for you — while loading the xformers part?

@ChrisLiu6
Collaborator

Hi, if you don't have enough GPU memory, please consider using quantization. See the following for an example:
https://github.com/Alpha-VLLM/LLaMA2-Accessory/blob/main/accessory/demos/multi_turn_mm_box.py#L77

If you don't want to quantize the model:
We have not tried 24G-memory GPUs, but as a rough estimate, the GPU memory cost for hosting SPHINX across two GPUs should be close to 24G per GPU (without quantization). So you may be able to run it without quantization after some optimization, but overall it would be a very tight fit.
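As a back-of-the-envelope check on these numbers, the memory for the weights alone can be estimated from the parameter count and the bits per parameter. A minimal sketch, assuming SPHINX's language backbone is roughly LLaMA-2-13B scale (~13B parameters) and ignoring activations, the KV cache, and the visual encoders, which all add several more GB on top:

```python
def estimate_weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Return the GiB needed just to hold the model weights.

    Ignores activations, KV cache, visual encoders, and framework
    overhead, so treat the result as a lower bound.
    """
    return n_params * bits_per_param / 8 / 1024**3

# Assumption: ~13e9 parameters for the language backbone.
N = 13e9

fp16 = estimate_weight_memory_gb(N, 16)  # ~24.2 GiB: already over a 24GB card
int4 = estimate_weight_memory_gb(N, 4)   # ~6.1 GiB: fits with room for overhead
print(f"fp16 weights: {fp16:.1f} GiB, 4-bit weights: {int4:.1f} GiB")
```

This is why splitting the fp16 model across two GPUs is borderline on 24GB cards, while 4-bit quantization brings the weights comfortably within a single card's memory.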

@gaopengpjlab
Contributor

@yaomingzhang @swearos Please refer to issue 114:
#114

@swearos swearos closed this as completed Nov 24, 2023