Issues: opea-project/GenAIExamples

Issues list

[Bug] opea/vllm:latest utilizing just one core vs opea/vllm:1.2 (labels: bug)
#1532 opened Feb 12, 2025 by devpramod · 2 of 8 tasks
[Bug] ChatQnA file upload issue (labels: bug)
#1530 opened Feb 12, 2025 by zhangyuting · 2 of 8 tasks
[Bug] CI breaks due to the .length operator (labels: bug, CI)
#1529 opened Feb 12, 2025 by lianhao · 1 of 8 tasks
[Feature] [OIM] Make inference backend configurable (labels: feature)
#1525 opened Feb 11, 2025 by Yu-amd
AgentQnA: Add support for PVC to pass in a volume to hold model and tools (labels: good first issue, help wanted, OPEAHack, v1.3)
#1524 opened Feb 11, 2025 by mkbhanda
AgentQnA - Create a CPU-specific Helm chart for Kubernetes environments (labels: good first issue, help wanted, OPEAHack, v1.3)
#1523 opened Feb 11, 2025 by mkbhanda
AgentQnA example on CPU suggests using a managed inference endpoint versus a smaller model (labels: feature, good first issue, OPEAHack)
#1512 opened Feb 11, 2025 by mkbhanda
Update AgentQnA docker compose to use the same database across Intel/CPU and Gaudi (labels: good first issue, OPEAHack)
#1511 opened Feb 11, 2025 by mkbhanda
[Bug] GraphRag dataprep/retriever error (labels: bug)
#1509 opened Feb 10, 2025 by lianhao · 2 of 8 tasks
[Bug] ChatQnA does not support configuring the LLM protocol (labels: bug)
#1504 opened Feb 8, 2025 by Yugar-1 · 2 of 8 tasks
[Feature] Please provide in each GenAIExample a link to its infrastructure-specific Helm chart, if available (labels: documentation, feature)
#1500 opened Feb 6, 2025 by mkbhanda
[Feature] Please provide for each example the typical resources needed (CPU and memory) (labels: feature, OPEAHack)
#1498 opened Feb 6, 2025 by mkbhanda
[Bug] Productivity Suite : not (labels: bug)
#1489 opened Feb 1, 2025 by ShankarRIntel · 2 of 8 tasks
[Bug] DocSum (labels: bug)
#1486 opened Jan 30, 2025 by ShankarRIntel · 3 of 8 tasks
[Bug] DocIndexRetriever: services don't start (labels: bug)
#1485 opened Jan 30, 2025 by ShankarRIntel · 3 of 8 tasks
[Bug] ChatQnA dataprep cannot connect to embedding service (labels: bug)
#1482 opened Jan 29, 2025 by gavinlichn · 2 of 8 tasks
[Bug] DocSum times out/fails to execute a long task [Regression] (labels: bug)
#1481 opened Jan 28, 2025 by vrantala · 2 of 8 tasks
[Bug] VideoQnA docker compose deployment failing (labels: bug)
#1479 opened Jan 28, 2025 by ezelanza · 2 of 8 tasks
[Bug] VideoQnA failed on Xeon (labels: bug)
#1477 opened Jan 27, 2025 by chensuyue · 2 of 8 tasks
[Bug] AvatarChatbot: Services do not start (labels: bug, invalid)
#1475 opened Jan 27, 2025 by ShankarRIntel · 3 of 8 tasks
[Bug] dataprep/retriever is not working as expected in ChatQnA with Qdrant (labels: bug)
#1473 opened Jan 26, 2025 by lianhao · 2 of 8 tasks
[Bug] AgentQnA: Couldn't start the Svelte UI (labels: bug)
#1471 opened Jan 26, 2025 by ShankarRIntel · 3 of 8 tasks