letta [0.6.41]

Trying to use OpenAI-compatible API services such as requesty.ai and OpenRouter with recent function-calling LLMs like QwQ-32B always produces "stream generator failed." errors.

Config (using the SDK):
```python
import os

# Imports assume the letta-client Python SDK
from letta_client import CreateBlock, EmbeddingConfig, Letta, LlmConfig

client = Letta(base_url="http://localhost:8283")  # local Letta server (assumed)

agent = client.agents.create(
    name="thedude",
    memory_blocks=[
        CreateBlock(
            label="human",
            value="Name: The Dude",
        ),
        CreateBlock(
            label="persona",
            value="You are a helpful assistant",
        ),
    ],
    llm_config=LlmConfig(
        model="nebius/Qwen/QwQ-32B",
        model_endpoint_type="openai",
        model_endpoint="https://router.requesty.ai/v1",
        api_key=os.getenv("OPENAI_API_KEY"),
        context_window=16384,
    ),
    embedding_config=EmbeddingConfig(
        embedding_endpoint_type="hugging-face",
        embedding_endpoint="https://embeddings.memgpt.ai",
        embedding_model="letta-free",
        embedding_dim=1024,
        embedding_chunk_size=300,
    ),
)
```
ERROR LOG:

```
File "/app/letta/llm_api/llm_api_tools.py", line 61, in wrapper
File "/app/letta/llm_api/llm_api_tools.py", line 205, in create
File "/app/letta/llm_api/openai.py", line 387, in openai_chat_completions_process_stream
File "/app/letta/llm_api/openai.py", line 255, in openai_chat_completions_process_stream
File "/app/letta/llm_api/openai.py", line 424, in openai_chat_completions_request_stream
File "/app/.venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 279, in wrapper
File "/app/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 914, in create
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1242, in post
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 919, in request
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1023, in _request
openai.BadRequestError: Error code: 400 - {'error': {'message': ''}}
None
Traceback (most recent call last):
File "/app/letta/server/rest_api/utils.py", line 77, in sse_async_generator
File "/usr/lib/python3.11/asyncio/threads.py", line 25, in to_thread
File "/usr/lib/python3.11/concurrent/futures/thread.py", line 58, in run
File "/app/letta/server/server.py", line 716, in send_messages
File "/app/letta/server/server.py", line 399, in _step
File "/app/letta/tracing.py", line 188, in sync_wrapper
File "/app/letta/agent.py", line 779, in step
File "/app/letta/agent.py", line 1055, in inner_step
File "/app/letta/agent.py", line 892, in inner_step
File "/app/letta/tracing.py", line 188, in sync_wrapper
File "/app/letta/agent.py", line 455, in _get_ai_reply
File "/app/letta/agent.py", line 414, in _get_ai_reply
File "/app/letta/tracing.py", line 188, in sync_wrapper
File "/app/letta/llm_api/llm_api_tools.py", line 118, in wrapper
File "/app/letta/llm_api/llm_api_tools.py", line 61, in wrapper
File "/app/letta/llm_api/llm_api_tools.py", line 205, in create
File "/app/letta/llm_api/openai.py", line 387, in openai_chat_completions_process_stream
File "/app/letta/llm_api/openai.py", line 255, in openai_chat_completions_process_stream
File "/app/letta/llm_api/openai.py", line 424, in openai_chat_completions_request_stream
File "/app/.venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 279, in wrapper
File "/app/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 914, in create
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1242, in post
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 919, in request
File "/app/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1023, in _request
openai.BadRequestError: Error code: 400 - {'error': {'message': ''}}
/app/letta/server/rest_api/utils.py:133: UserWarning: SSE stream generator failed: Error code: 400 - {'error': {'message': ''}}
warnings.warn(f"SSE stream generator failed: {e}")
Letta.letta.server.rest_api.utils - ERROR - Caught unexpected Exception: Error code: 400 - {'error': {'message': ''}}
```
Maybe these services don't support streaming, tool use, or function calls? Has anyone successfully used any LLMs from requesty.ai?
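One way to narrow this down is to reproduce the request outside of Letta: send the provider a streaming chat-completions request with a tool definition and see whether it also returns a 400. This is a minimal sketch; the endpoint and model name are copied from the config above, and the `get_weather` tool is a hypothetical placeholder purely for testing tool-call support:

```python
import json
import os

# Minimal payload of the kind Letta sends: a streaming chat-completions
# request that includes a tool definition.
payload = {
    "model": "nebius/Qwen/QwQ-32B",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "stream": True,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool, for testing only
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))

# To send it against the provider directly (requires a valid key), uncomment:
# import requests
# resp = requests.post(
#     "https://router.requesty.ai/v1/chat/completions",
#     headers={"Authorization": f"Bearer {os.getenv('OPENAI_API_KEY')}"},
#     json=payload,
#     stream=True,
# )
# print(resp.status_code, resp.text[:500])
```

If this direct request also fails with a 400, the provider (or the routed model) is rejecting streaming tool-call requests and the problem is upstream of Letta; if it succeeds, the issue is in how Letta builds the request.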