
Deploy SadTalker with Triton using CUDA optimization; see SadTalkerTriton below. Real time. #9

Open
foocker opened this issue Oct 7, 2023 · 19 comments

Comments

foocker commented Oct 7, 2023

SadTalkerTriton
25 fps, 84 frames, 2.4 s total, on a 3090 Ti.
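For context, the arithmetic behind the "real time" claim: 84 frames at 25 fps is 3.36 s of output video, generated in 2.4 s. A quick check:

```python
# Sanity check on the reported numbers: 84 frames at 25 fps
# generated in 2.4 s implies a faster-than-real-time factor.
frames = 84
fps = 25
gen_time_s = 2.4

video_duration_s = frames / fps                    # 3.36 s of output video
realtime_factor = video_duration_s / gen_time_s    # > 1 means faster than real time

print(f"video duration: {video_duration_s:.2f} s")   # video duration: 3.36 s
print(f"real-time factor: {realtime_factor:.2f}x")   # real-time factor: 1.40x
```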

Spycsh (Owner) commented Oct 8, 2023

@foocker Great work! Does the 2.4 s also include the face-enhancement (GFPGAN) time?

foocker (Author) commented Oct 9, 2023

No, the result looked fine without the enhancer, so I removed it.

Spycsh (Owner) commented Oct 9, 2023


Cool!

@tbridelbertomeu

@foocker Do you still have the repo?

foocker (Author) commented Aug 9, 2024

Yes, but it's now closed as private.

@tbridelbertomeu


Is there any chance we can get in touch in private and talk about it?

foocker (Author) commented Aug 12, 2024

It's open now, but only for a few days.

@tbridelbertomeu


Great, thank you! Is it still SadTalkerTriton?

foocker (Author) commented Aug 12, 2024

Yes.

@tbridelbertomeu


Thanks @foocker 👌

@tbridelbertomeu

@foocker Sorry, when you mention that you exported all the submodels to ONNX (esp. the generator, kp_detector, etc.), which models from SadTalker are you referring to?

foocker (Author) commented Aug 19, 2024

The URL in the README is all you need.

@tbridelbertomeu


Hey @foocker

I saw the URL and downloaded the content, but the generator.onnx is missing. Do you have it?

foocker (Author) commented Aug 20, 2024 via email

@tbridelbertomeu

Alright, thanks!

@tbridelbertomeu

@foocker I can't run it out of the box: the *_reference.txt you left doesn't work with any Python 3 I tried. Could you tell me which Python version you were using, and which 22.11 Triton image? Thanks!


zepingcan commented Aug 23, 2024

@foocker Hi, while reproducing your code, Triton threw an error; it seems to say the weights have no output shape specified.

I followed step 4 of your README (4. In server container: tritonserver --model-repo ./), and then the error was thrown.

Could you check whether this is the problem, or whether I made a mistake somewhere else?

If replying on GitHub is a hassle, my WeChat is EST_Tracer. I work on autonomous driving in North America; happy to chat more sometime!

[screenshot: Triton error output]

foocker (Author) commented Aug 24, 2024

Why is someone working on autonomous driving running this? This version is a bit old and I haven't touched it in a long time. It looks like you need to re-convert the ONNX models; I remember there is a conversion script in the repo. If that fails, follow my approach and redo it on the latest version.
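Independent of re-exporting, if Triton complains that an ONNX model's output shape is unspecified, one thing worth checking is declaring the tensor dims explicitly in that model's config.pbtxt inside the Triton model repository. A hypothetical sketch; the model and tensor names and shapes below are placeholders, not taken from the repo:

```protobuf
name: "generator"          # placeholder model name
platform: "onnxruntime_onnx"
max_batch_size: 0
input [
  {
    name: "source_image"   # placeholder tensor name
    data_type: TYPE_FP32
    dims: [ 1, 3, 256, 256 ]
  }
]
output [
  {
    name: "prediction"     # placeholder tensor name; use the real ONNX
    data_type: TYPE_FP32   # output names, e.g. via `polygraphy inspect`
    dims: [ 1, 3, 256, 256 ]
  }
]
```

Triton can often auto-complete these fields from the ONNX graph, but if the export left shapes dynamic or missing, spelling them out explicitly avoids the ambiguity.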

@zepingcan


Work reasons: there's a new project to test. It's running now, so let me leave some instructions for future readers:
The error I hit above, and most of the problems you may hit while running this repo, are likely caused by misaligned versions of Triton, ONNX Runtime, TensorRT, CUDA, or cuDNN. Look up the correct version alignment from these two links:
https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements
https://docs.nvidia.com/deeplearning/triton-inference-server/release-notes/rel_20-03.html
When aligning versions, let the Triton version in the requirements take priority.
Once the versions are sorted out, everything else is a minor issue.
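The alignment check above can be sketched as a small script. The compatibility table below is an illustrative placeholder only; fill it in from the two official pages linked above for your container tag:

```python
# Illustrative sketch: verify that the component versions installed in the
# container match one compatibility row taken from the official docs.
# REQUIRED is a made-up placeholder table; populate it from the onnxruntime
# CUDA-ExecutionProvider requirements page and the Triton release notes.
REQUIRED = {
    "tritonserver": "22.11",   # placeholder values, not an authoritative row
    "onnxruntime": "1.13.1",
    "cuda": "11.8",
}

def check_versions(installed: dict) -> list:
    """Return a list of mismatch messages; an empty list means aligned."""
    problems = []
    for component, wanted in REQUIRED.items():
        got = installed.get(component)
        if got is None:
            problems.append(f"{component}: not found")
        elif not got.startswith(wanted):
            problems.append(f"{component}: have {got}, need {wanted}")
    return problems

if __name__ == "__main__":
    # In practice, read these from the container, e.g.
    #   python3 -c "import onnxruntime; print(onnxruntime.__version__)"
    installed = {"tritonserver": "22.11", "onnxruntime": "1.12.0", "cuda": "11.8"}
    for p in check_versions(installed):
        print("MISMATCH:", p)
```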


4 participants