Hi there...
First off, very excited to be testing Yioop software for a possible project I've been wanting to revisit.
The dev server I'm testing with is a Xeon D-1521 @ 2.4 GHz, 32 GB RAM, and 2x480 GB SSDs in RAID 1, running Debian 8 (64-bit).
Whatever I load into the Crawl section starts by gathering robots.txt from the URLs and then just seemingly stops processing. At first I thought I was confusing "Allowed To Crawl Sites" with "Seed Sites", but I'm pretty sure I understand those definitions. I removed my custom entries and went back to the Yioop defaults, figuring I'd run a test crawl with those for a bit to confirm things were functional. Unfortunately, that doesn't work for me either.
Is this a configuration issue or something with PHP?
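In case it helps diagnose, here's roughly the check I've been running to see whether Yioop's crawl processes are actually alive after I start a crawl (the process names are my guess based on the Yioop source layout; adjust the pattern if your version names them differently):

```shell
#!/bin/sh
# Quick check: are the queue server / fetcher processes running?
# Process-name pattern is an assumption; tweak it for your Yioop version.
if ps aux | grep -v grep | grep -qiE 'queue_server|fetcher'; then
  echo "crawl processes running"
else
  echo "no crawl processes found"
fi
```

When I run this shortly after starting a crawl from the web interface, I can at least tell whether the processes died or are just sitting idle.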
Thanks for any assistance or thoughts ... appreciate it.
Paul