
Segmentation Fault w/ Very Long Words

From: Rodney Barnett <rbarnett(at)not-real.neuromics.com>
Date: Sun Sep 01 2002 - 18:22:47 GMT
I just ran into a segmentation fault while using the prog method.  I tracked
the trigger down to a very long "word" (in this case, it was roughly 429,000
characters long).  I certainly don't want that "word" to be indexed, but the
program shouldn't crash either.

It was easy for me to avoid the problem by eliminating the offending data
from the stream I feed to swish-e, so it's not an urgent problem for me.
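For anyone hitting the same crash, the workaround above can be sketched as a small pre-filter in front of the prog-method stream. This is a hypothetical illustration, not part of swish-e itself: the script name, the `MAX_WORD_LEN` threshold, and the `strip_long_words` helper are all my own inventions; the only assumption from the report is that dropping the oversized "word" before swish-e sees it avoids the segfault.

```python
import sys

# Hypothetical cutoff: far longer than any real word, far shorter than
# the roughly 429,000-character token that triggered the crash.
MAX_WORD_LEN = 1000

def strip_long_words(line, limit=MAX_WORD_LEN):
    """Return the line with any whitespace-delimited token longer
    than `limit` removed."""
    return " ".join(w for w in line.split() if len(w) <= limit)

if __name__ == "__main__":
    # Filter stdin to stdout so it can sit in the pipeline that
    # feeds the document stream to swish-e's prog input.
    for line in sys.stdin:
        print(strip_long_words(line))
```

This just treats anything whitespace-delimited as a "word", which matches how the offending token showed up in my data; a filter matching swish-e's own WordCharacters-based tokenization would need more care.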

I was first using a swish-e snapshot from a week or two ago, but I switched to today's CVS and the problem is still there.

I'm not using libxml2 and I have not changed the MaxWordLimit parameter from
its default.

Are there any other details that are important?

Rodney
Received on Sun Sep 1 18:26:25 2002