Thanks for your quick response. It turned out to be my local .env file and app config: requests served through the browser and commands run locally were sharing the same key prefix, so they mixed up keys in Redis ...
I've fixed it. I realized that once a queue worker is running, it runs independently of the Laravel job that defines the work, so to stop processing I have to kill the worker directly. For now I simply give each job 100 URLs to process and schedule the job to run every minute. That works for my real business case, although I think it will still need some tuning to stay efficient as more and more sites need to be crawled in parallel.
If you know of a better solution, please let me know, much appreciated!
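For reference, the batch-of-100-plus-scheduler setup described above could be sketched roughly like this. This is a minimal sketch, not the poster's actual code; `CrawlUrls` and the `Site` model are hypothetical names, and the query assumes a `crawled` flag exists:

```php
// app/Console/Kernel.php — hypothetical sketch of the described setup.
protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        // Pick the next 100 uncrawled URLs and dispatch one job for the batch.
        $urls = Site::where('crawled', false)->limit(100)->pluck('url');

        if ($urls->isNotEmpty()) {
            CrawlUrls::dispatch($urls->all());
        }
    })->everyMinute()->withoutOverlapping();
}
```

`withoutOverlapping()` here only stops the scheduler from stacking up a new dispatch while the previous one is still running; it does not stop a worker that is already processing, which still has to be killed directly as noted above.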
When I set the uniqueId in the queue job file, the job never finishes any more... The queue gets blocked ...
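One possible explanation, in case it helps: with Laravel's `ShouldBeUnique` jobs, a uniqueness lock is taken when the job is dispatched, and if a worker dies before the lock is released, further dispatches with the same `uniqueId` are silently skipped, which looks like a blocked queue. Setting `$uniqueFor` gives the lock an expiry so it cannot block forever. A sketch, assuming the hypothetical `CrawlUrls` job:

```php
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;

class CrawlUrls implements ShouldQueue, ShouldBeUnique
{
    // Release the uniqueness lock after 10 minutes even if the job
    // never completed, so later dispatches are not blocked forever.
    public $uniqueFor = 600;

    public function uniqueId()
    {
        return 'crawl-urls'; // hypothetical key; one unique job at a time
    }
}
```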