mirror of https://github.com/yacy/yacy_search_server.git
synced 2024-09-19 00:01:41 +02:00

First update of german language file + corresponding changes in HTML-file. Many will follow...

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@349 6c8d7289-2bf4-0310-a012-ef5d649a1542

parent bef3aaec38
commit 35c7a5883b
@@ -32,7 +32,7 @@
 <p>
 <div class=small id="startCrawling"><b>Start Crawling Job:</b>
-You can define url's as start points for Web page crawling and start that crawling here.
+You can define URLs as start points for Web page crawling and start that crawling here.
 </div>
 <table border="0" cellpadding="5" cellspacing="0" width="100%">
 <form action="IndexCreate_p.html" method="post" enctype="multipart/form-data">
@@ -87,7 +87,7 @@ You can define url's as start points for Web page crawling and start that crawli
 </td>
 </tr>
 <tr valign="top" class="TableCellDark">
-<td class=small>Do Remote Indexing</td>
+<td class=small>Do Remote Indexing:</td>
 <td class=small><input type="checkbox" name="crawlOrder" align="top" #(crawlOrderChecked)#::checked#(/crawlOrderChecked)#></td>
 <td class=small colspan="3">
 If checked, the crawl will try to assign the leaf nodes of the search tree to remote peers.
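The checkbox line above uses YaCy's servlet-template placeholder syntax, where `#(key)#branch0::branch1#(/key)#` appears to select a branch by the numeric value the servlet assigns to `key` (here, an empty string or `checked`). A minimal sketch of that expansion, with illustrative function and variable names that are not from the source:

```python
import re

def expand_conditionals(template: str, props: dict) -> str:
    """Expand #(key)#branch0::branch1#(/key)# placeholders, picking the
    branch indexed by the integer value of props[key]. A sketch of the
    apparent template convention, not YaCy's actual implementation."""
    pattern = re.compile(r"#\((\w+)\)#(.*?)#\(/\1\)#", re.DOTALL)

    def pick(match: re.Match) -> str:
        branches = match.group(2).split("::")
        index = int(props.get(match.group(1), 0))
        # Fall back to the first branch if the value is out of range.
        return branches[index] if index < len(branches) else branches[0]

    return pattern.sub(pick, template)

html = '<input type="checkbox" name="crawlOrder" #(crawlOrderChecked)#::checked#(/crawlOrderChecked)#>'
print(expand_conditionals(html, {"crawlOrderChecked": 1}))
# -> <input type="checkbox" name="crawlOrder" checked>
```

With `crawlOrderChecked` set to 0 the first (empty) branch is chosen and the checkbox renders unchecked.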
@@ -1,6 +1,6 @@
 #English=German
 <!-- lang -->default\(english\)=Deutsch
-<!-- author -->=Übersetzer eintragen
+<!-- author -->=Roland Ramthun

 #-----------------------------------------------------------
 #index.html
@@ -16,20 +16,25 @@ order by:=sortiert nach:
 >Date-Quality=>Datum-Qualität
 Resource:=Quelle:
 Max. search time \(seconds\)=Max. Suchzeit (Sekunden)
-"URL mask":=URL-Maske
+URL mask:=URL-Maske:


 #-------------------------------------------------------
 #IndexCreate_p.html
 Index Creation=Index Erzeugung
-You can define url's as start points for Web page crawling and start that crawling here.=Du kannst hier URLs angeben, die gecrawlt werden sollen und dann das Crawling starten.
+You can define URLs as start points for Web page crawling and start that crawling here.=Du kannst hier URLs angeben, die gecrawlt werden sollen und dann das Crawling starten.
 Crawling Depth:=Crawl-Tiefe:
 Crawling Filter:=Crawl-Maske:
 <fehlt>
 Store to Proxy Cache:=In den Proxy-Cache speichern:
 Do Local Indexing:=Lokales Indexieren:
 Do Remote Indexing:=Indexieren auf anderen Peers:
 Exclude static Stop-Words:=Stop-Words ausschließen:
 Start Point:=Startpunkt:

 #-------------------------------------------------------
 #Status.html

 #Das passt hier, aber wenn wo anders Protection steht eher nicht..,
 Protection=Sicherheit

 Welcome to YaCy!=Willkommen bei YaCy!
 System version=Systemversion
 Proxy host=Proxy Host
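The language file pairs each English source string with its German replacement, one `English=German` entry per line, with `#` comment lines grouping entries by servlet page and `\(`/`\)` escaping literal parentheses on the key side. A minimal sketch of how such a file might be parsed and applied; the function names are illustrative assumptions, not YaCy's actual translator API:

```python
def load_translation(lines):
    """Parse YaCy-style 'English source=German target' lines into a dict,
    skipping comments (#...) and blank lines and unescaping \\( \\) in
    the key. A sketch of the file format shown in the diff above."""
    table = {}
    for line in lines:
        line = line.rstrip("\n")
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        table[key.replace("\\(", "(").replace("\\)", ")")] = value
    return table

def translate(html: str, table: dict) -> str:
    # Apply longer keys first so overlapping phrases translate cleanly.
    for key in sorted(table, key=len, reverse=True):
        html = html.replace(key, table[key])
    return html

table = load_translation([
    "#IndexCreate_p.html",
    "URL mask:=URL-Maske:",
    "Start Point:=Startpunkt:",
])
print(translate("<td>Start Point:</td>", table))
# -> <td>Startpunkt:</td>
```

This is why the commit updates the HTML and the language file together: the English strings are the lookup keys, so changing `url's` to `URLs` in IndexCreate_p.html only keeps the German text working if the key in the language file changes to match.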