
Re: large index

From: David Norris <dave(at)not-real.webaugur.com>
Date: Sat Aug 19 2000 - 02:04:11 GMT
Lee Herron wrote:
> I have everything working properly, but I think the OS is shutting

Which version of SWISH-E, OS, compiler, etc?  Sounds like maybe you are
hitting some quota limit (maybe a memory or runtime quota?).  How long
does it run before being killed?  Do you have a debugger or some other
way to see what is killing it?  A backtrace might tell you something
useful.
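A process that dies with a bare "killed" has often hit a resource limit imposed by the shell or by system accounting. A quick first check (limit names are standard POSIX; the values in effect are per-system):

```shell
# List every resource limit in effect for this session:
ulimit -a

# The CPU-time limit alone, in seconds ("unlimited" if none is set):
ulimit -t
```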

Have you tried running it nice to see what happens?  I always run
indexing at nice 20 just to keep it from hogging resources needed by
more important processes.
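On most Unixes the usable range tops out at niceness 19, so a requested 20 is clamped. The invocation looks like this; the swish-e command line is illustrative, so substitute your own binary path and config file:

```shell
# Start indexing at the lowest scheduling priority.  The swish-e line
# is illustrative and commented out -- adjust paths for your setup.
# nice -n 19 swish-e -c /etc/swish-e/swish.conf

# Confirm what niceness a command actually receives (prints 19 here,
# since most systems clamp requests above 19):
nice -n 19 nice
```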

> down the index process (during it, it ends with 'killed') and I'm
> trying to come up with a work around.

Does it pass the make test?

> Currently I'm thinking of collecting all the filenames into a list, 
> breaking up that list into a few smaller sub-list, indexing them 
> with unique indices and merging them all together afterwards. 

That's worth a try.
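A sketch of that approach, splitting the list into 2000-name chunks with the standard `split` utility. The swish-e flags shown (-c, -i, -f, -M) are taken from the SWISH-E documentation but left commented out here; check them against your version's manual before relying on them:

```shell
# Stand-in list of 9500 filenames, one per line (in practice you'd do
# something like: ls /path/to/htdocs/*.html > docs.txt).
seq 1 9500 | sed 's/^/page/;s/$/.html/' > docs.txt

# Break the list into sub-lists of 2000 names each (chunk.aa, chunk.ab, ...):
split -l 2000 docs.txt chunk.

# Index each sub-list into its own index file (swish-e call commented out):
for list in chunk.*; do
    : # nice -n 19 swish-e -c swish.conf -i `cat "$list"` -f "index.$list"
done

# Afterwards, merge the partial indexes into one:
# swish-e -M index.chunk.* index.merged
```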

> (btw, I'm only trying to index 9500 html
> docs -- geez, what better need for a search engine) These are all in
> one directory, so the obvious answer doesn't apply.

I'm not sure what the obvious answer is.  I don't think it's RAM, you'd
get an "out of memory" error or similar.  SWISH-E uses an error-checking
malloc.  Many things come to mind.  9500 documents isn't a large
quantity, but if they are sufficiently large, it could be taking too
much CPU time.

> Any tips or ideas?

Load it into a debugger and try to see what's really going on.
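One hypothetical route, assuming swish-e was compiled with debugging symbols (-g) and the shell allows core dumps; the gdb steps are commented out because paths and builds vary:

```shell
# Enable core dumps, reproduce the kill, then read the backtrace.
# ulimit -c unlimited
# ./swish-e -c swish.conf        # run until it is "killed"
# gdb ./swish-e core             # open the binary together with its core
# (gdb) bt                       # print the backtrace at the point of death

# Check whether cores are currently enabled at all ("0" means disabled):
ulimit -c
```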

-- 
,David Norris
  Dave's Web - http://www.webaugur.com/dave/
  Dave's Weather - http://www.webaugur.com/dave/wx
  ICQ Universal Internet Number - 412039
  E-Mail - dave@webaugur.com

"I would never belong to a club that would have me as a member!"
                                          - Groucho Marx
Received on Fri Aug 18 19:00:52 2000