 

My sitemap generator always interrupts frequently; after "Resume last session" the count shows 0

Started by 234251617, January 09, 2026, 11:36:40 AM


234251617

My sitemap generator always interrupts frequently.
Even after setting memory_limit to 1024M and max_execution_time to 0, it still interrupts frequently.
I can only keep clicking "Resume last session" until "Pages crawled" reaches more than 20000 (added in sitemap). But after continuing the interrupted session, the data count is back to 0.
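
One point worth noting: memory_limit and max_execution_time set in php.ini or .htaccess apply to the web-server PHP process, while the PHP command line uses its own configuration. If the crawl is started from the CLI (as discussed later in this thread with runcrawl.php), the same limits can be passed per run with -d overrides; this is a generic PHP CLI sketch, not a generator-specific command:

php -d memory_limit=1024M -d max_execution_time=0 runcrawl.php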


234251617

I haven't used SSH before and am not familiar with it. How can I run the sitemap generator? Can you provide a detailed explanation of how to run it?

XML-Sitemaps Support

You would need to contact your website hosting support for details regarding SSH connection, and then the command line to be used via SSH can be found on the "Create sitemap" page of the sitemap generator interface.
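
For reference, connecting over SSH from a terminal generally looks like the lines below; the username, hostname, and installation path here are placeholders, not values specific to this hosting account:

ssh your_username@your-server.example.com
cd /path/to/generator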

234251617

I've accessed the server via SSH, so how do I run the generator?

Display as follows:
Last failed login: Mon Jan 19 16:28:26 CST 2026 from 92.118.39.145 on ssh:notty
There were 61840 failed login attempts since the last successful login.
Last login: Thu Jan 8 17:18:48 2026 from ebs-21541
[root@ebs-21541 ~]#

234251617

After accessing the server via SSH, should I run index.php or runcrawl.php? And how do I apply the settings I previously configured in the web interface when running from the command line?

nohup php runcrawl.php > output.log 2>&1 &

or

nohup php index.php > output.log 2>&1 &
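
Whichever of the two scripts turns out to be the right one, once it has been started in the background with nohup its progress can be checked with standard shell tools, for example:

jobs                                        # list background jobs started in this shell
ps aux | grep runcrawl.php | grep -v grep   # confirm the PHP process is still alive
tail -f output.log                          # follow the crawler's output as it runs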

234251617

[root@ebs-21541 generator]# nohup php runcrawl.php > output.log 2>&1 &
[1] 8134
[root@ebs-21541 generator]# ps aux | grep runcrawl.php | grep -v grep
[1]+  Exit 255                nohup php runcrawl.php > output.log 2>&1
[root@ebs-21541 generator]#
The above is the command line I ran. After connecting to the server via SSH and starting runcrawl.php, ps aux | grep runcrawl.php | grep -v grep shows nothing and the job exits with code 255. What should I do?
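
An exit status of 255 from the PHP CLI usually means the script hit a fatal error before the crawl got going. Since nohup redirected both stdout and stderr to output.log, the actual error message should be in that file; the commands below are generic shell diagnostics, not generator-specific steps:

cat output.log       # the PHP fatal error message should appear here
php -v               # check which PHP CLI version is being used
pwd                  # confirm the command is run from the generator's own directory
php runcrawl.php     # run once in the foreground to see the error printed directly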