Commit Graph

22 Commits

theli
5c56b9ed59 *) Catch exceptions that could occur during URL decoding
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1451 6c8d7289-2bf4-0310-a012-ef5d649a1542
2006-01-26 13:57:49 +00:00
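
The affected decoding code isn't shown in the log; a minimal sketch of the pattern, assuming java.net.URLDecoder is used (it throws IllegalArgumentException on malformed percent-escapes):

    import java.io.UnsupportedEncodingException;
    import java.net.URLDecoder;

    public class SafeUrlDecoder {
        // Decode a URL component, falling back to the raw string if decoding fails.
        public static String decodeOrKeep(String s) {
            try {
                return URLDecoder.decode(s, "UTF-8");
            } catch (IllegalArgumentException e) {
                return s; // malformed escape sequence such as "%zz"
            } catch (UnsupportedEncodingException e) {
                return s; // cannot happen for "UTF-8", but the API declares it
            }
        }
    }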
theli
754a35877f *) Changing robots parser exclusion policy
- crawling is now allowed if the server returned a 403 status code
     when trying to download the robots.txt
   See: http://www.yacy-forum.de/viewtopic.php?t=1612

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1421 6c8d7289-2bf4-0310-a012-ef5d649a1542
2006-01-24 08:53:30 +00:00
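
A sketch of the changed policy with illustrative names (the actual YaCy method isn't shown in the log): only a successfully downloaded robots.txt can forbid crawling; a 403 on the robots.txt itself no longer blocks the host:

    public class RobotsExclusionPolicy {
        interface Rules { boolean isAllowed(String path); }

        // statusCode is the HTTP status returned for the robots.txt download.
        public static boolean crawlingAllowed(int statusCode, String path, Rules rules) {
            if (statusCode == 200) return rules.isAllowed(path); // obey the parsed file
            return true; // 403 (or 404) means "no usable robots.txt": crawling permitted
        }
    }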
orbiter
7920e1547d code cleanup
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1163 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-12-05 09:13:13 +00:00
theli
9649d08171 *) More tolerant robots parser
- converting tabs to spaces
   - cutting off '*' in the disallow section

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1056 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-11-11 07:49:54 +00:00
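
An illustrative sketch of the two tolerance rules (names are not YaCy's):

    public class RobotsLineCleanup {
        // Tabs are treated like spaces so "Disallow:<TAB>/path" still parses.
        static String normalize(String line) {
            return line.replace('\t', ' ').trim();
        }

        // A trailing '*' is cut off because rules are plain prefix matches.
        static String normalizeDisallowValue(String value) {
            value = value.trim();
            if (value.endsWith("*")) value = value.substring(0, value.length() - 1);
            return value;
        }
    }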
theli
93cadb47b9 *) More tolerant robots parser for robots files which are missing empty lines between rule blocks
See: http://www.yacy-forum.de/viewtopic.php?p=12471

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1048 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-11-08 07:41:25 +00:00
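
A sketch of the tolerance, under the assumption that a new User-agent line simply closes the previous rule block when the separating blank line is missing:

    import java.util.ArrayList;
    import java.util.List;

    public class RobotsBlockSplitter {
        public static List<List<String>> splitBlocks(List<String> lines) {
            List<List<String>> blocks = new ArrayList<>();
            List<String> current = new ArrayList<>();
            boolean sawRules = false; // current block already contains rule lines
            for (String raw : lines) {
                String line = raw.trim();
                if (line.isEmpty()) continue; // blank separators are optional now
                boolean newAgent = line.toLowerCase().startsWith("user-agent:");
                if (newAgent && sawRules) {   // block boundary without a blank line
                    blocks.add(current);
                    current = new ArrayList<>();
                    sawRules = false;
                }
                if (!newAgent) sawRules = true;
                current.add(line);
            }
            if (!current.isEmpty()) blocks.add(current);
            return blocks;
        }
    }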
theli
f9fb284fb7 *) Better handling of robots.txt files with incorrect keywords
See: http://www.yacy-forum.de/viewtopic.php?p=12292#12292

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1035 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-11-06 06:01:08 +00:00
theli
b8ceb1ffde *) Adding better https support for the crawler
- solving problems with unknown certificates by implementing a dummy trust manager
   - adding https support to the robots parser
   - the seed file can now be downloaded from https resources
   - adapting plasmaHTCache.java to support https URLs properly

*) URL Normalization
   - sub-URLs are now normalized properly during indexing
   - pointing the urlNormalForm function of plasmaParser to the htmlFilterContentScraper function
   - normalizing URLs which were received by a crawlOrder request

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1024 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-11-03 15:28:37 +00:00
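
The "dummy trust manager" is a standard JSSE pattern; a minimal sketch (it accepts every server certificate, i.e. it disables validation entirely, a deliberate trade-off for a crawler):

    import java.security.SecureRandom;
    import java.security.cert.X509Certificate;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManager;
    import javax.net.ssl.X509TrustManager;

    public class TrustAllSslContext {
        public static SSLContext create() throws Exception {
            TrustManager dummy = new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                public void checkClientTrusted(X509Certificate[] chain, String authType) { }
                public void checkServerTrusted(X509Certificate[] chain, String authType) { }
            };
            SSLContext ctx = SSLContext.getInstance("TLS");
            ctx.init(null, new TrustManager[] { dummy }, new SecureRandom());
            return ctx;
        }
    }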
theli
3b5d0eb053 *) Synchronizing robots.txt downloads to avoid parallel downloads of the same file by separate threads
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@998 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-28 09:10:48 +00:00
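
A sketch of the idea with illustrative names: one lock object per host, so concurrent requests for the same robots.txt serialize and all but the first are served from the cache:

    import java.util.HashMap;
    import java.util.Map;

    public class RobotsDownloadGate {
        private final Map<String, Object> hostLocks = new HashMap<>();
        private final Map<String, String> cache = new HashMap<>(); // host -> robots.txt body

        private Object lockFor(String host) {
            synchronized (hostLocks) {
                Object lock = hostLocks.get(host);
                if (lock == null) { lock = new Object(); hostLocks.put(host, lock); }
                return lock;
            }
        }

        public String fetch(String host) {
            synchronized (lockFor(host)) {    // parallel callers for this host wait here
                String cached;
                synchronized (cache) { cached = cache.get(host); }
                if (cached != null) return cached;
                String body = download(host); // assumed helper doing the real HTTP GET
                synchronized (cache) { cache.put(host, body); }
                return body;
            }
        }

        private String download(String host) { return ""; } // placeholder
    }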
theli
6c48c3ce39 *) Bugfix for ArithmeticException during IndexTransfer
See: http://www.yacy-forum.de/viewtopic.php?t=1362

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@974 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-23 16:07:44 +00:00
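
The forum thread isn't reproduced here, so the failing expression is unknown; a typical cause is integer division by a zero elapsed time when computing transfer statistics. A purely hypothetical guard:

    public class TransferStats {
        // Hypothetical: never let a zero divisor reach the division.
        static long wordsPerSecond(long transferredWords, long elapsedMillis) {
            if (elapsedMillis <= 0) return 0; // would otherwise throw ArithmeticException
            return transferredWords * 1000L / elapsedMillis;
        }
    }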
theli
02d9af1a70 *) Restructuring and extending remote proxy support
- remote proxy configuration can now be "really" changed on the fly and takes effect immediately
   - adding possibility to disable remote proxy usage for yacy->yacy communication
   - adding possibility to disable remote proxy usage for ssl
   - restructuring proxy configuration so that it is stored in a single place now

*) Adding the possibility to import a foreign word DB (or even several of them in parallel)
   at runtime into the peer's DB
   - this can be done by calling IndexImport_p.html
   - ATTENTION: please note that at the moment this thread must be aborted via the GUI
     before a normal server shutdown is done.
   - TODO: integrating the IndexImport thread into the normal server shutdown
   - TODO: adding the possibility to import crawl queues, etc. from foreign peers
   - TODO: removing the old import function from yacy.java and calling the new routines instead

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@968 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-22 13:28:04 +00:00
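
A sketch of the "single place" restructuring under stated assumptions (class and field names are invented for illustration): an immutable settings object behind a volatile reference, so swapping it in takes effect immediately for the next request:

    public final class RemoteProxyConfig {
        public final boolean enabled;
        public final String host;
        public final int port;
        public final boolean useForYacyProtocol; // yacy->yacy communication
        public final boolean useForSsl;

        public RemoteProxyConfig(boolean enabled, String host, int port,
                                 boolean useForYacyProtocol, boolean useForSsl) {
            this.enabled = enabled;
            this.host = host;
            this.port = port;
            this.useForYacyProtocol = useForYacyProtocol;
            this.useForSsl = useForSsl;
        }

        // All components read through this reference; replacing it is atomic.
        private static volatile RemoteProxyConfig current =
                new RemoteProxyConfig(false, "", 0, false, false);

        public static RemoteProxyConfig get() { return current; }
        public static void set(RemoteProxyConfig c) { current = c; }
    }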
theli
40777556c5 *) Connection Tracking
- adding automatic refresh
   - accepts a new parameter nameLookup which can be used to deactivate
     yacy-peer name lookup (because we have problems with this on large seed DBs)

*) ViewFile
   New page that can be used to view 
   - original content 
   - plain text content 
   - parsed content
   - parsed sentences 
   of a webpage specified by its URL hash
   Mainly for debugging purposes at the moment

*) Robots.txt 
   Bugfix for if-modified-since usage
   TODO: synchronization of downloads to avoid loading the same robots-file 
   multiple times in parallel by different threads

*) Shutdown
   Better abort handling for transferRWI and transferURL sessions on server shutdown

*) Status Page
   Adding icon to start/stop crawling via status page

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@950 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-18 07:45:27 +00:00
theli
959eefbc4f *) Robots.txt parser/ppt
cutting off comments at the line end
*) Adding a thread pool for the stackCrawl thread to speed up robots.txt downloads
   and double-URL checks

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@882 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-09 04:43:07 +00:00
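
An illustrative sketch of the comment rule: everything from '#' to the end of the line is dropped before the keyword/value split:

    public class RobotsComments {
        static String stripComment(String line) {
            int pos = line.indexOf('#');
            return (pos < 0) ? line.trim() : line.substring(0, pos).trim();
        }
        // stripComment("Disallow: /private # staff only") -> "Disallow: /private"
    }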
theli
a2fa75e688 *) Asynchronous queuing of crawl job URLs (stackCrawl)
various checks like the blacklist check or the robots.txt disallow check are now
   done by a separate thread to unburden the indexer thread(s)
   TODO: maybe we have to introduce a threadpool here if it turns out that this single
         thread is a bottleneck because of the time-consuming robots.txt downloads

*) improved index transfer
   The index selection and transmission is done in parallel now to improve index 
   transfer performance.
   TODO: maybe we could speed up performance by using multiple transmission threads in
         parallel instead of only a single one.

*) gzip-encoded POST requests
   it is now configurable whether a gzip-encoded POST request should be sent on
   index transfer/distribution

*) storage peer (very experimental and not optimized yet)
   Now it's possible to send the result of the yacy indexer thread to a remote peer
   instead of storing the indexed words locally.
   This can be done by setting the property "storagePeerHash" in the yacy config file
   - Please note that if the index transfer fails, the index is stored locally.
   - TODO: currently this index transfer is done by the indexer thread.
     To speed up the indexer
      a) this transmission should be done in parallel and
      b) multiple chunks should be bundled and transferred together


*) general performance improvements
   - better memory cleanup after http request processing has finished
   - replacing some string concatenations with StringBuffers
   - replacing BufferedInputStreams with serverByteBuffer
   - replacing Vectors with ArrayLists wherever possible
   - replacing Hashtables with HashMaps wherever possible
   This was done because calls to Vector or Hashtable methods
   take about three times longer than calls to ArrayList or HashMap methods.
   TODO: we should take a look at the class serverObject, which inherits from HashMap.
         Do we really need synchronization for this class?
   TODO: replace ArrayLists with LinkedLists if random access to the list elements is not needed

*) Robots Parser supports if-modified-since downloads now
   If the downloaded robots.txt file is older than 7 days the robots parser tries to
   download the robots.txt with the if-modified-since header to avoid unnecessary downloads
   if the file was not changed. Additionally the ETag header is used to detect changes.

*) Crawler: better handling of unsupported MIME types and file extensions

*) Bugfix: plasmaWordIndexEntity was not closed correctly in 
   - query.java
   - plasmaSwitchboard.java

*) function minimizeUrlDB added to yacy.java
   this function tests the current urlHashDB for unused URLs
   ATTENTION: please don't use this function at the moment because
              it causes the wordIndexDB to flush all words into the
              word directory!

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@853 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-10-05 10:45:33 +00:00
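
A sketch of the conditional robots.txt download described in the robots-parser item above, using the plain JDK client for brevity (YaCy ships its own HTTP client): a copy younger than seven days is used as-is; an older one is revalidated with If-Modified-Since and, when known, the stored ETag, so a 304 answer transfers no body:

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ConditionalRobotsFetch {
        private static final long SEVEN_DAYS = 7L * 24 * 60 * 60 * 1000;

        // Returns true if the cached copy must be replaced by a fresh download.
        public static boolean needsRefetch(String robotsUrl, long lastFetchMillis,
                                           String lastETag) throws Exception {
            if (System.currentTimeMillis() - lastFetchMillis < SEVEN_DAYS) {
                return false; // cache is considered fresh, no request at all
            }
            HttpURLConnection con =
                    (HttpURLConnection) new URL(robotsUrl).openConnection();
            con.setIfModifiedSince(lastFetchMillis);
            if (lastETag != null) con.setRequestProperty("If-None-Match", lastETag);
            int status = con.getResponseCode();
            con.disconnect();
            return status != HttpURLConnection.HTTP_NOT_MODIFIED; // 304 -> keep cache
        }
    }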
theli
023be89586 *) Bugfix for "robots.txt is loaded over and over again"
See: http://www.yacy-forum.de/viewtopic.php?p=10241#10233

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@794 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-26 08:05:59 +00:00
orbiter
dc474aa22f various bug-fixes
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@792 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-26 01:10:41 +00:00
rramthun
9dfbd93c7b Updated German language file
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@748 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-19 15:44:17 +00:00
theli
2cd695f376 *) Bugfix: path entries of robots.txt were not decoded correctly
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@676 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-07 11:49:53 +00:00
theli
f8ad65eae1 *) First trial implementation of robots.txt support
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@674 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-07 11:17:21 +00:00
allo
9300689dde bugfix *gr*
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@662 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-05 10:46:11 +00:00
allo
ebc39a7b9a minor fixes
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@659 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-05 10:34:26 +00:00
allo
f90f699ab1 missing package line.
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@655 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-05 10:04:27 +00:00
allo
06a451768f a simple robotsParser.
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@652 6c8d7289-2bf4-0310-a012-ef5d649a1542
2005-09-05 07:35:19 +00:00
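
For reference, a from-scratch sketch in the spirit of this first version (not the original class): collect the Disallow prefixes of the '*' user-agent block and test paths against them:

    import java.util.ArrayList;
    import java.util.List;

    public class SimpleRobotsParser {
        private final List<String> disallowed = new ArrayList<>();

        public void parse(String robotsTxt) {
            boolean relevantBlock = false;
            for (String raw : robotsTxt.split("\n")) {
                String line = raw.trim();
                String lower = line.toLowerCase();
                if (lower.startsWith("user-agent:")) {
                    relevantBlock = line.substring(11).trim().equals("*");
                } else if (relevantBlock && lower.startsWith("disallow:")) {
                    String path = line.substring(9).trim();
                    if (path.length() > 0) disallowed.add(path); // empty value = allow all
                }
            }
        }

        // Disallow rules are simple prefix matches on the URL path.
        public boolean isDisallowed(String path) {
            for (String prefix : disallowed) {
                if (path.startsWith(prefix)) return true;
            }
            return false;
        }
    }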