Mirror of https://github.com/yacy/yacy_search_server.git
a31b9097a4
Two main changes must be implemented to enable mass remote crawls:

- shift control of robots.txt to the crawl queue (away from the stacker). This is necessary because remote crawls can contain unchecked URLs; each peer must check robots.txt itself to prevent it from being misused as a crawl agent for unwanted file retrieval.
- implement new index files that control the double-check of remotely crawled URLs.

After removal of the robots.txt check from the stacker threads, the multi-threading of that process is void and has been removed. The thread pools for the crawl threads have also been removed, since creating these threads is not resource-consuming; for a detailed explanation see svn 4106.

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@4181 6c8d7289-2bf4-0310-a012-ef5d649a1542
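To illustrate the two changes described in the commit, here is a minimal, self-contained sketch. All class and method names are hypothetical and this is not the actual YaCy crawler code: it only shows the idea that the robots.txt check happens when an entry is taken off the crawl queue (so remotely submitted, unchecked URLs are still filtered locally), and that a hash set of already-seen URLs provides the double-check for remotely crawled URLs.

// Sketch only: hypothetical names, not the YaCy implementation.
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.Locale;
import java.util.Queue;
import java.util.Set;

public class RemoteCrawlQueueSketch {

    private final Queue<String> queue = new ArrayDeque<>();
    // double-check index: URLs that were already stacked once
    private final Set<String> seenUrls = new HashSet<>();

    /** Stack a URL without any robots.txt check; remote peers may submit anything. */
    public void stack(String url) {
        if (seenUrls.add(url)) queue.add(url);
    }

    /** Take the next URL that this peer is actually allowed to fetch. */
    public String nextAllowedUrl() throws Exception {
        String url;
        while ((url = queue.poll()) != null) {
            if (isAllowedByRobots(url)) return url;   // robots check at dequeue time
        }
        return null;                                   // queue exhausted
    }

    /** Very small robots.txt interpretation: honour Disallow lines under "User-agent: *". */
    private static boolean isAllowedByRobots(String url) throws Exception {
        URL u = new URL(url);
        URL robots = new URL(u.getProtocol(), u.getHost(), u.getPort(), "/robots.txt");
        URLConnection conn = robots.openConnection();
        conn.setConnectTimeout(3000);
        conn.setReadTimeout(3000);
        boolean forAllAgents = false;
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                line = line.trim();
                if (line.toLowerCase(Locale.ROOT).startsWith("user-agent:")) {
                    forAllAgents = line.substring(11).trim().equals("*");
                } else if (forAllAgents && line.toLowerCase(Locale.ROOT).startsWith("disallow:")) {
                    String path = line.substring(9).trim();
                    if (!path.isEmpty() && u.getPath().startsWith(path)) return false;
                }
            }
        } catch (FileNotFoundException e) {
            return true; // no robots.txt means everything is allowed
        }
        return true;
    }

    public static void main(String[] args) throws Exception {
        RemoteCrawlQueueSketch q = new RemoteCrawlQueueSketch();
        q.stack("https://example.org/index.html");
        q.stack("https://example.org/index.html"); // duplicate, dropped by the double-check
        System.out.println("next: " + q.nextAllowedUrl());
    }
}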
42 lines
1.2 KiB
HTML
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>YaCy '#[clientname]#': Loader Queue</title>
  #%env/templates/metas.template%#
</head>
<body id="IndexCreateLoaderQueue">
  #%env/templates/header.template%#
  #%env/templates/submenuIndexCreate.template%#
  <h2>Loader Queue</h2>

  <p>
  #(loader-set)#
  The loader set is empty</p>
  ::
  There are #[num]# entries in the loader set:</p>
  <table border="0" cellpadding="2" cellspacing="1">
    <colgroup>
      <col width="60" />
      <col width="10" />
      <col />
      <col />
    </colgroup>
<tr class="TableHeader">
|
|
<th>Initiator</th>
|
|
<th>Depth</th>
|
|
<th>Status</th>
|
|
<th>URL</th>
|
|
</tr>
|
|
#{list}#
|
|
<tr class="TableCell#(dark)#Light::Dark#(/dark)#">
|
|
<td>#[initiator]#</td>
|
|
<td>#[depth]#</td>
|
|
<td>#[status]#</td>
|
|
<td><a href="#[url]#">#[url]#</a></td>
|
|
</tr>
|
|
#{/list}#
|
|
</table>
|
|
#(/loader-set)#
|
|
#%env/templates/footer.template%#
|
|
</body>
|
|
</html>
|
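For context, the markers in this template are filled by the page's servlet through YaCy's key/value template engine: #[key]# is a plain substitution, #(key)# .. :: .. #(/key)# selects a branch by the integer stored under the key, and #{list}#...#{/list}# repeats once per entry, with nested keys joined by underscores and an index. The following is a rough sketch of how a servlet might populate this particular page; the class name, the plain map standing in for serverObjects, and the method signature are assumptions for illustration, not a copy of the actual Loader Queue servlet.

// Sketch only: shows the key naming scheme the template above expects,
// assuming the usual YaCy template conventions (branch by integer, list
// repetition by count, nested keys prefixed with the parent name).
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LoaderQueuePageSketch {

    /** Fill the template keys for the Loader Queue page from a list of queue entries. */
    public static Map<String, String> respond(List<String[]> entries) {
        Map<String, String> prop = new LinkedHashMap<>();
        if (entries.isEmpty()) {
            prop.put("loader-set", "0");                 // first branch: "The loader set is empty"
        } else {
            prop.put("loader-set", "1");                 // second branch: the table
            prop.put("loader-set_num", Integer.toString(entries.size()));
            prop.put("loader-set_list", Integer.toString(entries.size())); // #{list}# row count
            for (int i = 0; i < entries.size(); i++) {
                String[] e = entries.get(i);             // {initiator, depth, status, url}
                String p = "loader-set_list_" + i + "_";
                prop.put(p + "dark", Integer.toString(i % 2)); // alternating row colour
                prop.put(p + "initiator", e[0]);
                prop.put(p + "depth", e[1]);
                prop.put(p + "status", e[2]);
                prop.put(p + "url", e[3]);
            }
        }
        return prop;
    }

    public static void main(String[] args) {
        List<String[]> rows = new ArrayList<>();
        rows.add(new String[] {"localhost", "2", "loading", "http://example.org/"});
        respond(rows).forEach((k, v) -> System.out.println(k + " = " + v));
    }
}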