Lines matching refs:urls (webchecker.py)
9 errors. A subweb is specified by giving one or more ``root URLs''; a
10 page belongs to the subweb if one of the root URLs is an initial
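
The membership rule in the fragment above (source lines 9-10) is a plain
prefix test: a page is inside the subweb if some root URL is a prefix of the
page's URL. A minimal sketch of that rule; the helper name and the roots
list are illustrative, not webchecker's actual API:

    def in_subweb(url, roots):
        """True if url falls under any of the given root URLs."""
        return any(url.startswith(root) for root in roots)

    roots = ["http://www.example.com/docs/"]
    print(in_subweb("http://www.example.com/docs/intro.html", roots))  # True
    print(in_subweb("http://www.example.com/blog/", roots))            # False
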
16 the interpretation of ``file:'' URLs is extended to mimic the behavior
32 work status every 50 URLs (adjustable with the -r option), and it
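
The ``file:'' extension mentioned at source line 16 makes local checking
behave like an HTTP daemon serving a directory. A hedged sketch of that
idea, assuming the usual index.html convention; resolve_file_path is an
illustrative name, not a webchecker function:

    import os

    def resolve_file_path(path):
        """Serve a local directory the way an HTTP daemon would."""
        if os.path.isdir(path):
            index = os.path.join(path, "index.html")
            if os.path.isfile(index):
                return index                 # the directory's index page
            return sorted(os.listdir(path))  # otherwise a directory listing
        return path                          # plain files pass through
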
58 The agent name is hardwired to "webchecker". URLs that are disallowed
59 by the robots.txt file are reported as external URLs.
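
The robots.txt handling described above can be reproduced with the standard
robotparser module (urllib.robotparser in Python 3; webchecker itself uses
the older Python 2 spelling). The example URLs are illustrative:

    import urllib.robotparser

    AGENTNAME = "webchecker"   # the hardwired agent name

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()
    if not rp.can_fetch(AGENTNAME, "http://www.example.com/private/page.html"):
        # webchecker reports such URLs as external instead of crawling them
        print("disallowed by robots.txt")
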
280 # Add a name table, so that the name URLs can be checked. Also
281 # serves as an implicit cache for which URLs are done.
345 urls = self.todo.keys()
346 urls.sort()
347 del urls[self.roundsize:]
348 for url in urls:
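
Source lines 345-348 above pick one round of work: the pending URLs are
sorted and the list is truncated to self.roundsize (set with the -r option).
A self-contained sketch of the same logic, with todo/done dictionaries
standing in for webchecker's tables (done doubles as the "implicit cache"
mentioned at source lines 280-281):

    todo = {"http://a/": None, "http://c/": None, "http://b/": None}
    done = {}                        # implicit cache: finished URLs
    roundsize = 2

    urls = sorted(todo)[:roundsize]  # Python 3 spelling of lines 345-347
    for url in urls:
        done[url] = todo.pop(url)    # process the URL, then mark it done
    print(sorted(done))              # ['http://a/', 'http://b/']
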
379 # to the URL directly, since the URLs in these
391 # All printing of URLs uses format_url(); argument changed to
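
The format_url() helper mentioned at source line 391 centralizes URL
printing. Its real definition is not shown in this listing; the following is
only a plausible sketch, assuming the changed argument is a (URL, fragment)
pair:

    def format_url(url):
        """Render a (URL, fragment) pair for printing (assumed signature)."""
        link, fragment = url
        return link + "#" + fragment if fragment else link

    print(format_url(("http://www.example.com/x.html", "sec2")))
    # http://www.example.com/x.html#sec2
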
593 # Because of the way the URLs are now processed, I need to