
Re: generation of large index failed with swish-e 2.4.3

From: swishe <swishe(at)>
Date: Tue Feb 01 2005 - 00:09:35 GMT

Thank you for your fast response.
We are glad that we can always count on help from you, Peter, and the
other helpful swish-e enthusiasts ;-)

After I wrote my mail to the swish-e list, we tried one more thing.
We used the config.nice with LARGEFILE support and ran
make clean / make install to rebuild swish-e with LARGEFILE
support. After this we tried again to generate the index for our
~3,000,000 records. Six hours later it was built!
swish-e is really very powerful software!! Thanks to all
of you who developed it!

I should explain that we had two versions of swish-e 2.4.3 on the
same machine, one without LARGEFILE support and one with it:
   swish-e-largefile / swish-e-smallfile
We used a symlink in /usr/local/bin to toggle between the two versions,
because we had seen that index files built with
swish-e-largefile are much larger than those built with
swish-e-smallfile, which seems understandable given the 32-bit vs.
64-bit representation of the data structures.

So we decided to keep both versions on one server, because most of our
indexes are < 2 GB, and for the rare cases where an index file
needs large-file support we wanted to call swish-e-largefile.

Do you know why this idea did not work?

Best wishes, Uwe 

Uwe Dierolf
University of Karlsruhe - University Library
P.O.Box 6920, 76049 Karlsruhe, Germany
phone(fax) :  49/721/608-6076(4886)
www        :

On Mon, Jan 31, 2005 at 08:52:27AM -0800, Bill Moseley wrote:
> On Mon, Jan 31, 2005 at 12:39:03AM -0800, swishe wrote:
> > Using the "-e" option swish-e processed all XML records (id 1111111135
> > is the last record in our XML file). But it stopped working without
> > any error message in the logfile and without generating a core dump.
> > end of logfile:
> >    1111111135 - Using XML2 parser -  (20 words)
> > 
> >    Removing very common words...
> >    no words removed.
> >    Writing main index...
> >    Sorting words ...
> >    Sorting 14,395,157 words alphabetically
> >    Writing header ...
> >    Writing index entries ...
> >      Writing word text: ...
> So it's just stopping at this point without any error message or core
> dump? Could the OS be killing the process?
> Not many people are indexing that many documents, but I'd still expect
> to see some error message.
> -- 
> Bill Moseley
Received on Mon Jan 31 16:09:42 2005