English and German Forums for YaCy: uncensorable, untraceable search engines and freedom of information.
Orbiter wrote: There is actually a hardcoded limitation of 2 documents per second for the same domain. This is done in combination with proper identification of the crawler as 'yacybot'. Together, the rate limitation and the crawler identification are a promise to web hosters that YaCy is a well-behaved robot and does not overload web services.
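The per-domain limit described above (2 documents per second means a minimum delay of 0.5 s between requests to the same host) can be sketched as a simple throttle. This is an illustrative sketch, not YaCy's actual Java implementation; the function and variable names here are hypothetical.

```python
import time
from collections import defaultdict

# Illustrative constants (not taken from YaCy's source):
USER_AGENT = "yacybot"   # the crawler identifies itself under this name
MIN_DELAY = 0.5          # 2 requests/second => 0.5 s between hits to one domain

_last_fetch = defaultdict(float)  # domain -> timestamp of the last request

def wait_for_slot(domain, now=None, sleep=time.sleep, clock=time.monotonic):
    """Block until a request to `domain` is allowed, then record the slot.

    `now` and `sleep` are injectable so the throttle can be tested
    without real waiting; in normal use the defaults apply.
    """
    if now is None:
        now = clock()
    elapsed = now - _last_fetch[domain]
    if elapsed < MIN_DELAY:
        sleep(MIN_DELAY - elapsed)   # back off until the 0.5 s window expires
        now += MIN_DELAY - elapsed
    _last_fetch[domain] = now
    return now
```

Because the delay is tracked per domain, fetching from many different hosts in parallel is unaffected; only repeated requests to the same host are spaced out.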