When the client is receiving "long processing" topics, it will be disconnected at some point. #829

Closed
markus-ja opened this issue Dec 19, 2019 · 6 comments · Fixed by #891
Labels: bug (Something isn't working), duplicate (This issue or pull request already exists)

Comments

@markus-ja

In my application I want to receive images, do some modifications, and save them. To simulate a worst-case scenario I put a Thread.Sleep(500) delay in the MqttClient.ApplicationReceiveHandler.

When publishing to that topic about 100 times in a for loop, the subscribing client drops the connection at some point. The server log shows "client xxx has exceeded timeout, disconnecting."

I tried setting .WithKeepAlivePeriod up to 5 minutes, and then it worked!

It seems that while a long receive queue is being processed, the keep-alive signal cannot be handled until the queued topics have been dequeued (processed).
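
A minimal sketch of the setup described above, assuming MQTTnet 3.x client APIs; the broker address, topic name, and keep-alive period are placeholders, not values from the original report:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using MQTTnet;
using MQTTnet.Client;
using MQTTnet.Client.Options;

class SlowHandlerRepro
{
    static async Task Main()
    {
        var client = new MqttFactory().CreateMqttClient();

        var options = new MqttClientOptionsBuilder()
            .WithTcpServer("localhost", 1883)              // placeholder broker
            .WithKeepAlivePeriod(TimeSpan.FromSeconds(15)) // raising this to 5 minutes hides the problem
            .Build();

        // Simulated "worst case": slow, synchronous work inside the receive handler.
        client.UseApplicationMessageReceivedHandler(e =>
        {
            Thread.Sleep(500); // stands in for image processing + saving
        });

        await client.ConnectAsync(options, CancellationToken.None);
        await client.SubscribeAsync("images/#");           // placeholder topic

        await Task.Delay(Timeout.Infinite);                // keep the process alive
    }
}
```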

@SeppPenner
Collaborator

Maybe this is related? #755.

As far as I understand it, this is a bug in the server?

@SeppPenner SeppPenner added the bug label Dec 19, 2019
@markus-ja
Author

markus-ja commented Dec 19, 2019

Not sure if it is a server bug.

I logged the following message coming from MqttClient.TrySendKeepAliveMessagesAsync():

[2019-12-19T14:35:17.6474769Z] [] [10] [MqttClient] [Warning]: MQTT communication exception while sending/receiving keep alive packets.
MQTTnet.Exceptions.MqttCommunicationTimedOutException: Exception of type 'MQTTnet.Exceptions.MqttCommunicationTimedOutException' was thrown.
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at MQTTnet.PacketDispatcher.MqttPacketAwaiter`1.<WaitOneAsync>d__4.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at MQTTnet.Client.MqttClient.<SendAndReceiveAsync>d__46`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at MQTTnet.Client.MqttClient.<TrySendKeepAliveMessagesAsync>d__47.MoveNext()

It seems the server doesn't respond to the ping (keep alive) request.
I am using Mosquitto server v1.6.7 (win-x86).

@SeppPenner
Collaborator

> I am using Mosquitto server v1.6.7 (win-x86).

Maybe Mosquitto has some timeouts or size limits configured?

@ghost

ghost commented Dec 20, 2019

Hi, this is the same bug as the one mentioned in #790.

The issue is with the MqttClient. The ping is sent to the server and the server responds (I checked with Wireshark), and then the MQTT client times out for some reason (the MqttCommunicationTimedOutException).

We resolved this issue by creating a local queue and storing the received messages there. MQTT is then only responsible for sending and receiving messages, while another task is responsible for processing the received messages.
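
A rough sketch of that workaround, assuming MQTTnet 3.x and System.Threading.Channels; `client` is the application's already-connected IMqttClient and `ProcessImage` is a placeholder for the slow application handler:

```csharp
using System.Threading.Channels;
using System.Threading.Tasks;
using MQTTnet;

// Local queue that decouples MQTTnet's receive loop from message processing.
var queue = Channel.CreateUnbounded<MqttApplicationMessage>();

// The handler only enqueues and returns immediately, so keep-alive and ack
// packets are no longer starved by slow application code.
client.UseApplicationMessageReceivedHandler(e =>
{
    queue.Writer.TryWrite(e.ApplicationMessage);
});

// A separate task does the slow work (image processing, saving, ...).
_ = Task.Run(async () =>
{
    await foreach (var message in queue.Reader.ReadAllAsync())
    {
        ProcessImage(message); // placeholder for the application's handler
    }
});
```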

@ghc4

ghc4 commented Dec 22, 2019

@Rochlop I'm thinking of the very same approach, but... what do you do when your local queue is full? Maybe unsubscribe from the topic while the local worker consumes the local queue, and then resubscribe?

@ghost

ghost commented Dec 23, 2019

@ghc4 we currently have a background service consuming from the queue, and each queued message is then processed on another thread using Task.Run(), so the background service is always able to keep consuming from the queue. However, in our use case we never have more than maybe 10 messages being processed at the same time, so I am not sure this is the best fit for your use case.
Your idea about unsubscribing sounds good. Another option may be to use MQTT ack messages to decline a message when the queue is too full, or to send a message count to the server and only have the server send messages while the client's current message count is below a certain threshold.
Otherwise, it is hard to say what would work best for your use case, as I do not really know much about it ;).
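
One way to sketch the unsubscribe/resubscribe idea from this exchange, building on the previous snippet with a bounded queue. This is only an illustration assuming MQTTnet 3.x extension methods; the topic, the capacity of 100, and `ProcessImage` are placeholders, and the resubscribe policy is deliberately simplistic:

```csharp
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using MQTTnet;

const string Topic = "images/#";                     // placeholder topic
var queue = Channel.CreateBounded<MqttApplicationMessage>(capacity: 100);
var subscribed = 1;                                  // 1 = subscribed, 0 = paused

client.UseApplicationMessageReceivedHandler(async e =>
{
    // TryWrite fails once the bounded queue is full (100 pending messages).
    if (!queue.Writer.TryWrite(e.ApplicationMessage) &&
        Interlocked.Exchange(ref subscribed, 0) == 1)
    {
        await client.UnsubscribeAsync(Topic);        // pause instead of dropping
    }
});

_ = Task.Run(async () =>
{
    await foreach (var message in queue.Reader.ReadAllAsync())
    {
        // One message at a time here, to keep the backpressure logic simple.
        await Task.Run(() => ProcessImage(message)); // placeholder slow worker

        // Resume as soon as a slot is free again; a real implementation would
        // probably wait for a lower watermark before resubscribing.
        if (Interlocked.CompareExchange(ref subscribed, 1, 0) == 0)
        {
            await client.SubscribeAsync(Topic);
        }
    }
});
```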

PSanetra added a commit to PSanetra/MQTTnet that referenced this issue Apr 3, 2020
This commit fixes issues which were caused by processing messages synchronously in the packet receiver loop. This blocked KeepAlive and Ack packets from being processed while a message was processed.

Fixes dotnet#648
Fixes dotnet#829
PSanetra added a commit to PSanetra/MQTTnet that referenced this issue Apr 4, 2020
This commit fixes issues which were caused by processing messages synchronously in the packet receiver loop. This blocked KeepAlive and Ack packets from being processed while a message was processed.

Fixes dotnet#648
Fixes dotnet#587
Fixes dotnet#829
@SeppPenner SeppPenner added the duplicate label Apr 5, 2020