yacy_search_server/htroot/CrawlProfileEditor_p.html
orbiter c48b73cda2 redesign of ranking data structure
- the index administration now uses the same code base for URL selection and collection
  as the search interface. The index administration is therefore a good test environment for
  ranking order control
- removed the old post-sorting algorithms; they will be replaced with a new one
- fixed many bugs that occurred during ranking; in particular, the constraint filtering method
  removed too many links
- fixed the media search flags; they had been attached to too many URLs. The effect should be
  better pre-sorting before media are loaded during snippet fetching

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@4223 6c8d7289-2bf4-0310-a012-ef5d649a1542
2007-11-21 23:14:57 +00:00

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
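<!--
  YaCy servlet template: the placeholders below are filled by the matching servlet
  (presumably CrawlProfileEditor_p.java in this htroot directory).
  Square-bracket markers insert a single value, round-bracket markers choose between
  alternatives separated by '::', curly-bracket markers repeat a block for each list
  entry, and percent markers include another template file.
-->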
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>YaCy '#[clientname]#': Crawl Profile Editor</title>
#%env/templates/metas.template%#
</head>
<body id="IndexCreateWWWGlobalQueue">
#%env/templates/header.template%#
#%env/templates/submenuIndexCreate.template%#
<h2>Crawl Profile Editor</h2>
<p>
Each crawl profile holds the information about a crawl start URL that is used internally to perform the crawl it belongs to.
The profiles for remote crawls, <a href="/ProxyIndexingMonitor_p.html">indexing via proxy</a> and snippet fetches
cannot be altered here because they are hard-coded.
</p>
<!-- crawl profile list -->
<fieldset><legend>Crawl Profile List</legend>
<table border="0" cellpadding="2" cellspacing="1">
<colgroup>
<col width="120" />
<col />
<col width="16" />
<col width="60" />
<col width="10" span="2" />
<col />
<col width="10" span="5" />
</colgroup>
<tr class="TableHeader">
<td><strong>Crawl Thread</strong></td>
<td><strong>Status</strong></td>
<td><strong>Start URL</strong></td>
<td><strong>Depth</strong></td>
<td><strong>Filter</strong></td>
<td><strong>MaxAge</strong></td>
<td><strong>Auto Filter Depth</strong></td>
<td><strong>Auto Filter Content</strong></td>
<td><strong>Max Pages Per Domain</strong></td>
<td><strong>Accept '?' URLs</strong></td>
<td><strong>Fill Proxy Cache</strong></td>
<td><strong>Local Text Indexing</strong></td>
<td><strong>Local Media Indexing</strong></td>
<td><strong>Remote Indexing</strong></td>
<td><strong>Status / Action</strong></td>
</tr>
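<!-- one table row per crawl profile; the enclosed block is repeated by the servlet for every known profile -->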
#{crawlProfiles}#
<tr class="TableCell#(dark)#Light::Dark#(/dark)#">
<td>#[name]#</td>
<td>#(status)#terminated::active#(/status)#</td>
<td><a href="#[startURL]#">#[startURL]#</a></td>
<td>#[depth]#</td>
<td>#[filter]#</td>
<td>#[crawlingIfOlder]#</td>
<td>#[crawlingDomFilterDepth]#</td>
<td>#{crawlingDomFilterContent}##[item]#<br />#{/crawlingDomFilterContent}#</td>
<td>#[crawlingDomMaxPages]#</td>
<td>#(withQuery)#no::yes#(/withQuery)#</td>
<td>#(storeCache)#no::yes#(/storeCache)#</td>
<td>#(indexText)#no::yes#(/indexText)#</td>
<td>#(indexMedia)#no::yes#(/indexMedia)#</td>
<td>#(remoteIndexing)#no::yes#(/remoteIndexing)#</td>
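<!-- action column: active profiles get a Terminate button, finished profiles a Delete button -->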
<td>#(terminateButton)#::
<div style="text-decoration:blink">Running</div>
<form action="CrawlProfileEditor_p.html" method="get" enctype="multipart/form-data">
<input type="hidden" name="handle" value="#[handle]#" />
<input type="submit" name="terminate" value="Terminate" />
</form>
#(/terminateButton)#
#(deleteButton)#::
Finished
<form action="CrawlProfileEditor_p.html" method="get" enctype="multipart/form-data">
<input type="hidden" name="handle" value="#[handle]#" />
<input type="submit" name="delete" value="Delete" />
</form>
#(/deleteButton)#
</td>
</tr>
#{/crawlProfiles}#
</table>
</fieldset>
<!-- crawl profile editor -->
<form action="CrawlProfileEditor_p.html" method="post" enctype="multipart/form-data">
<fieldset><legend>Select the profile to edit</legend>
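<!-- drop-down of all crawl profiles; the one currently being edited is pre-selected -->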
<select name="handle">#{profiles}#
<option value="#[handle]#"#(selected)#:: selected="selected"#(/selected)#>#[name]#</option>#{/profiles}#
</select>
<input type="submit" name="edit" value="Edit profile" />
</fieldset>
</form>
#(error)#::
<p class="error">An error occured during editing the crawl profile: #[message]#</p>
#(/error)#
#(edit)#::
<form action="/CrawlProfileEditor_p.html" method="post" enctype="multipart/form-data">
<fieldset><legend>Edit Profile #[name]#</legend>
<input type="hidden" name="handle" value="#[handle]#" />
<dl>#{entries}#
<dt>#(readonly)#<label for="#[name]#">#[label]#</label>::#[label]##(/readonly)#</dt>
<dd>#(readonly)#
<input id="#[name]#" name="#[name]#"
#(type)# type="checkbox"#(checked)#:: checked="checked"#(/checked)#::
type="text" value="#[value]#"::
type="text" value="#[value]#"#(/type)# />::
<strong>#(type)##(checked)#false::true#(/checked)#::#[value]#::#[value]##(/type)#</strong>#(/readonly)#
</dd>#{/entries}#
</dl>
<input type="submit" name="submit" value="Submit changes" />
</fieldset>
</form>
#(/edit)#
#%env/templates/footer.template%#
</body>
</html>