yacy_search_server/source/net/yacy/cora
orbiter 14764632b5 clear solr caches in case an exception occurs. The reason behind
this hack is the occurrence of exceptions like:
W 2014/02/11 18:51:33 ConcurrentLog GC overhead limit exceeded
java.io.IOException: GC overhead limit exceeded
        at net.yacy.cora.federate.solr.connector.AbstractSolrConnector.getDocumentById(AbstractSolrConnector.java:334)
        at net.yacy.cora.federate.solr.connector.MirrorSolrConnector.getDocumentById(MirrorSolrConnector.java:173)
        at net.yacy.cora.federate.solr.connector.ConcurrentUpdateSolrConnector.getDocumentById(ConcurrentUpdateSolrConnector.java:415)
        at net.yacy.search.index.Fulltext.getMetadata(Fulltext.java:331)
        at net.yacy.search.index.Fulltext.getMetadata(Fulltext.java:317)
        at net.yacy.search.query.SearchEvent.pullOneRWI(SearchEvent.java:1024)
        at net.yacy.search.query.SearchEvent.pullOneFilteredFromRWI(SearchEvent.java:1047)
        at net.yacy.search.query.SearchEvent$3.run(SearchEvent.java:1263)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.util.Arrays.copyOfRange(Arrays.java:3077)
        at java.lang.StringCoding.decode(StringCoding.java:196)
        at java.lang.String.<init>(String.java:491)
        at java.lang.String.<init>(String.java:547)
        at org.apache.lucene.codecs.compressing.CompressingStoredFieldsReader.readField(CompressingStoredFieldsReader.java:187)
        at org.apache.lucene.codecs.compressing.CompressingStoredFieldsReader.visitDocument(CompressingStoredFieldsReader.java:351)
        at org.apache.lucene.index.SegmentReader.document(SegmentReader.java:276)
        at org.apache.lucene.index.BaseCompositeReader.document(BaseCompositeReader.java:110)
        at org.apache.lucene.index.IndexReader.document(IndexReader.java:436)
        at org.apache.solr.search.SolrIndexSearcher.doc(SolrIndexSearcher.java:657)
        at net.yacy.cora.federate.solr.connector.EmbeddedSolrConnector.SolrQueryResponse2SolrDocumentList(EmbeddedSolrConnector.java:230)
        at net.yacy.cora.federate.solr.connector.EmbeddedSolrConnector.getDocumentListByParams(EmbeddedSolrConnector.java:320)
        at net.yacy.cora.federate.solr.connector.AbstractSolrConnector.getDocumentById(AbstractSolrConnector.java:330)
        ... 7 more
        
This problem was analysed with the Eclipse Memory Analyzer on a heap
dump, which reported the following as the main problem suspect:

One instance of "org.apache.solr.util.ConcurrentLRUCache" loaded by
"sun.misc.Launcher$AppClassLoader @ 0x42e940a0" occupies 902.898.256
(61,80%) bytes. The memory is accumulated in one instance of
"java.util.concurrent.ConcurrentHashMap$Segment[]" loaded by "<system
class loader>".

This memory is part of Solr's result cache. Flushing this cache appears
to be the most appropriate solution to this problem; a sketch of the
workaround follows the commit entry below.
2014-02-11 20:56:40 +01:00
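
As a rough illustration of the workaround described in the commit message, the sketch below (not the actual YaCy patch) wraps a Solr lookup and flushes the connector-side caches when the query fails, so a subsequent request does not run against an already exhausted heap. The class name CacheClearingConnector and the clearCaches() helper are hypothetical; only the SolrJ document types are real.

import java.io.IOException;

import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrDocumentList;

abstract class CacheClearingConnector {

    // provided by the concrete connector: run a Solr query and return the matching documents
    abstract SolrDocumentList getDocumentListByQuery(String query, int offset, int count) throws IOException;

    // hypothetical helper: drop the Solr-side result/document caches
    abstract void clearCaches();

    public SolrDocument getDocumentById(final String id) throws IOException {
        try {
            // look the document up by its id field; at most one hit is expected
            final SolrDocumentList list = getDocumentListByQuery("id:\"" + id + "\"", 0, 1);
            return list.isEmpty() ? null : list.get(0);
        } catch (final Throwable e) {
            // the hack from the commit message: flush the caches before propagating,
            // so the next request does not immediately hit the same exhausted heap
            clearCaches();
            throw new IOException(e.getMessage(), e);
        }
    }
}
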
ai Added 'final' for all exception blocks as this helps the Java compiler 2013-07-17 18:31:30 +02:00
date extended the Scheduler: introduced scheduled events 2012-12-22 16:27:14 +01:00
document removed warnings and superfluous logging 2014-02-09 12:26:58 +01:00
federate clear solr caches in case an exception occurs. The reason behind 2014-02-11 20:56:40 +01:00
geo - the webgraph shall store all links which appear on a web page and not 2013-09-15 00:30:23 +02:00
language set more logger to 'final static' 2013-11-13 06:18:48 +01:00
lod removed jena library and all code that depended on jena. When jena was 2014-02-07 01:20:06 +01:00
order added test to Base64Order (runs successfully!) 2013-11-22 10:38:42 +01:00
plugin added phonetic classes 2011-12-14 17:33:18 +01:00
protocol enhancements for staticIP and ipv6 handling 2014-01-27 13:48:20 +01:00
sorting Added 'final' for all exception blocks as this helps the Java compiler 2013-07-17 18:31:30 +02:00
storage replaced old caching in SolrConnector with a new one which is better for 2014-01-15 23:13:22 +01:00
util removed warnings and superfluous logging 2014-02-09 12:26:58 +01:00