Taskprocessor warnings - any ideas

I am running a current PBXact server with about 500 active extensions, most of which are connected using PJSIP. It's running Asterisk 16.9.0, and all modules are current. I am seeing the following warnings in the log, and I am wondering if something needs to be adjusted upward for the extension count we have. It's on a pretty rocking server, so it's not a problem at all to let Asterisk use more of the machine's resources.

Here are the warnings I am seeing:

[2020-05-26 16:48:57] WARNING[26365][C-00000569] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 16:48:57] WARNING[26367][C-00000569] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 16:52:20] WARNING[29631][C-00000572] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:02:05] WARNING[4350][C-00000580] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:08:53] WARNING[9307][C-0000058e] taskprocessor.c: The 'stasis/p:channel:all-000028ef' task processor queue reached 500 scheduled tasks.
[2020-05-26 17:15:20] WARNING[13284][C-00000598] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:23:52] WARNING[15986][C-000005a3] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:32:49] WARNING[17784][C-000005a9] taskprocessor.c: The 'stasis/p:channel:all-00002923' task processor queue reached 500 scheduled tasks.
[2020-05-26 17:32:49] WARNING[17784][C-000005a9] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:39:21] WARNING[20522][C-000005af] taskprocessor.c: The 'stasis/m:channel:all-00000f53' task processor queue reached 500 scheduled tasks again.
[2020-05-26 17:40:21] WARNING[18975][C-000005af] taskprocessor.c: The 'stasis/m:cache_pattern:1/channel:all-00000f52' task processor queue reached 500 scheduled tasks again.

If anyone can offer any clues or point me in the right direction, it would be great…

https://blogs.asterisk.org/2016/07/13/asterisk-task-processor-queue-size-warnings/

I found that in my searching; it is probably the same information, but I also saw the Asterisk 16.x version at https://wiki.asterisk.org/wiki/display/AST/Asterisk+16+Configuration_stasis on the site. Maybe I am not reading that document right, but I don't see the taskprocessor name from my log file (stasis/m:cache_pattern:) mentioned there, so I am at a loss as to what needs to be tweaked.

Also, in FreePBX/PBXact it seems there is no stasis.conf; I am guessing I can just create one in /etc/asterisk, but I am not sure. And my warnings say 500 tasks reached, while that document says the max size defaults to 50; I have no conf file at all and am seeing it hit 500, so again I am at a bit of a loss. (If I am reading it right, the 50 there is the threadpool max_size, i.e. worker threads, while the 500 in the warning looks like a separate per-queue high-water mark, but I may be off.)
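In case creating the file is the right move, here is a minimal sketch of what I would drop into /etc/asterisk/stasis.conf, using the threadpool option names from that wiki page (the values are placeholders to tune, not a recommendation):

; /etc/asterisk/stasis.conf (sketch)
[threadpool]
initial_size = 5        ; worker threads created at startup
idle_timeout_sec = 20   ; seconds an idle worker waits before exiting
max_size = 50           ; upper bound on worker threads

My assumption is that a full Asterisk restart would be needed for it to take effect, since the threadpool is set up at startup.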

So I am definitely seeking some guidance on exactly what I need to adjust; my google-fu hasn't really come up with anything solid yet…

Run this from SSH:

asterisk -rvx "core show taskprocessors"

What do you see that is queued up?
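If the list is long, you can trim it to the rows that actually have something waiting; a rough one-liner, assuming the "In Queue" figure is the third whitespace-separated column in your build's output (check the header first):

asterisk -rx "core show taskprocessors" | awk '$3+0 > 0'   # keep only rows with a non-zero queue depth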

The forum wouldn't let me post that much info directly, but the answer is: a lot of stuff!

I put the dump of that command into pastebin:

https://pastebin.freepbx.org/view/c0403777

If you add it to pastebin, I will look.

https://wiki.freepbx.org/display/SUP/Providing+Great+Debug#ProvidingGreatDebug-AsteriskLogs-PartII

But in short, do you see a queue backup on any of the line items? If yes, which ones have queued messages?
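You can also check the specific processors from your warnings by name, for example (pattern copied from the log lines above):

asterisk -rx "core show taskprocessors" | grep 'stasis/m:cache_pattern'   # show just the processors that warned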

OK, I updated it to use FreePBX's pastebin. I didn't actually know they had that available, but that works too.

https://pastebin.freepbx.org/view/c0403777

As to queue backups, I show nothing sitting in the queue at the time of the dump. If there is something specific I should be looking for, I am happy to do so…
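Maybe I will sample it during the busy hour so a transient backup gets caught in the act; just a quick loop I would run from SSH (the 30-second interval and the log path are arbitrary choices on my part):

#!/bin/sh
# Append a timestamped taskprocessor snapshot every 30 seconds.
while true; do
    date >> /tmp/taskprocessors.log
    asterisk -rx "core show taskprocessors" >> /tmp/taskprocessors.log
    sleep 30
done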
