Coral Usb Accelerator Support #69
Hey! That sounds really cool! I will see whether I'm able to integrate this without sacrificing ease of setup. The main point of this app is to make AI easily usable on your own data, after all. Still, it would of course be cool if people could optionally extend performance with more config options. I'll first have to figure out GPU support (see #67).
This issue is essential in enabling fast recognition on low-end devices. Just imagine how long it would take to classify a decent photo library on an old laptop or a Raspberry Pi.
This issue is far more complex than we thought.
Which is more likely to be done first, GPU support or Coral support? I ask because I have a "low-end" device (my mid-2010 Mac Pro tower) and would love to add this to my Nextcloud so I can move off Google Photos. It is low-end because it missed the AVX instruction set by one year.
@phirestalker GPU is more likely, IMO. Also note that what is still called "JS-mode" in the UI is now much faster in the latest release, because we're using WASM now.
It's also worth noting that you can compile TensorFlow yourself without requiring AVX. I once set out to automatically build a range of TensorFlow flavors using GitHub Actions, but never quite finished.
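For anyone attempting such a build themselves: one way to target pre-AVX CPUs is to pin the instruction-set baseline via compiler flags in a `.bazelrc` fragment before building the pip package. This is a sketch under the assumption of a standard from-source TensorFlow build, not a tested recipe:

```
# Hypothetical .bazelrc fragment: build an optimized wheel that targets
# baseline x86-64 (no AVX), so it runs on older CPUs.
build -c opt
build --copt=-march=x86-64 --host_copt=-march=x86-64
```

The default from-source build uses `-march=native`, which bakes in whatever instructions the build machine supports; pinning the baseline is what makes the resulting wheel portable to pre-AVX hardware.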
+1 for adding Coral support
I investigated this a bit, and it turns out that you can run tflite models on Coral, but you have to compile them specifically for it. Here is a code example: https://github.com/tensorflow/sig-tfjs/tree/main/tfjs-tflite-node-codelab/coral_inference_working
Would also very much like to see a Coral implementation. I'd be willing to test things out, report any issues, and help get this up and running.
+1 for Coral support here too.
I run Nextcloud in a VM on my home server, so it would be much easier to pass an accelerator like the Coral TPU through to the VM than to get any amount of GPU acceleration in the near future: the hardware compatible with ESXi vGPUs is incredibly limited and generally outside my budget just for speedy image tagging.
+1 for Coral support. |
Did anything come out of this? |
We are currently maintaining this app on a limited-effort basis. This means Nextcloud GmbH will not invest further development resources of our own in advancing this app with new features. That doesn't mean there will be no new features, however: we do review and enthusiastically welcome community pull requests, and we would be more than excited to collaborate with you on this issue. Feel free to reach out here in the comments if you would like to work on this and have questions about how to proceed, or would like a short introductory call on the code base. I'm here to help with your questions ✌️ (See #779 for more information on this.)
If you are interested in this feature you might want to upvote #73, which is a prerequisite for this. The more upvotes an issue has, the more likely it is that I get to spend time working on it :)
Looking forward to testing your app out, but I was curious whether you had put any thought/effort into supporting Edge TPUs like the Coral Accelerator: https://coral.ai/products/accelerator/
I use https://github.com/blakeblackshear/frigate/ as my NVR, and it does a great job with the Coral accelerator, dramatically speeding up TensorFlow detection.
Would be a great addition to this app!