Getting Stuck at Robots.txt - 12/04/2019 Yioop Software Help

It seems like when I try to crawl, the software gets "stuck" right off the bat when it sees the robots.txt file. I've looked at these files and they aren't disallowing anything. I have resorted to stopping and restarting the queue server, and that seems to fix it sometimes. Rebooting the machine also works sometimes, as does simply stopping the crawl and then starting a new crawl with a different name. It's really weird and I don't know why it does this, but when it sticks on robots.txt I have let it sit for 30 minutes before, and it really is "stuck".
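One quick way to double-check that the robots.txt files themselves really do allow everything is to run them through a parser locally, for example with Python's built-in urllib.robotparser. The robots.txt content and URL below are placeholders, not from my actual crawl:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt content -- paste in the actual file from the site
# the crawl sticks on to see what it allows.
robots_txt = """User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# An empty Disallow line means everything is allowed for all agents.
print(rp.can_fetch("*", "http://example.com/index.html"))  # prints True
```

If this prints True for the pages being crawled, the robots.txt really isn't the thing blocking the crawl, which points at the crawler side rather than the site.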

-- Getting Stuck at Robots.txt - 14/04/2019 Yioop Software Help

It really does seem like stopping and restarting the queue server fixes this issue.