Re: swish-e and Mac OS X

From: Bill Moseley <moseley(at)not-real.hank.org>
Date: Wed Jul 11 2001 - 13:32:18 GMT
At 06:06 AM 07/11/01 -0700, Chris Blackstone wrote:
>Anyone have problems running swish-e-2.1 dev under Mac OS X? 
>I have installed everything correctly, but whenever I try to index my
>site using either the http method or the prog method (using spider.pl)
>the swish-e process quickly takes over 200 Mb of memory and the machine
>grinds to a halt.
>
>I ran into this before with a FreeBSD server, and thanks to Bill
>Moseley's help it seemed to be solved, but now it's back.

You mean it happens with the http method?  Does running spider.pl without
swish eat memory, too?  Still have those test programs I sent?  Do those
eat RAM on OS X?

Are you using the spider.pl from the distribution or the one I sent you by
email?  That is, are you using the same spider.pl as you are using with
your FreeBSD setup?

I also thought that when you were having problems on FreeBSD, it was
working fine on OS X.

Just in case there are any Perl experts out there: the problem was that the
program spiders recursively, and the recursion lists contain URI objects.
Making a minor change to store plain scalars instead of blessed scalars
(URI objects) in those lists fixed the problem on FreeBSD.  Weird.
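
For illustration only (this is not the actual spider.pl code, and
fetch_and_extract_links() is a made-up stand-in): the change amounts to
flattening each URI back to a plain string before it goes into the lists
that drive the recursion, roughly like this:

    use strict;
    use warnings;
    use URI;

    sub fetch_and_extract_links { return () }  # stub for the real fetch/parse step

    sub spider {
        my ( $base, @links ) = @_;

        for my $link (@links) {
            # Resolve the link, then flatten it back to a plain scalar so
            # the recursion lists never hold blessed URI objects.  (A real
            # spider would also skip URLs it has already seen.)
            my $url = URI->new_abs( $link, $base )->canonical->as_string;

            my @found = fetch_and_extract_links( $url );
            spider( $url, @found );
        }
    }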

Anyone have ideas on how to track this down?  comp.lang.perl.moderated
didn't get me anywhere.


Another approach is to use a single list of URLs and avoid recursion
entirely: push on newly spidered links, shift off pages to spider (a rough
sketch is below).  But then there's the real issue of extra memory use from
tracking the parent page and URL depth for each element of the array.
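
Something along these lines (again just a sketch with made-up names; the
real spider would have to fetch pages, honor depth limits, and so on):

    use strict;
    use warnings;

    sub fetch_and_extract_links { return () }  # stub for the fetch/parse step

    # One flat queue; each entry carries the bookkeeping that the recursive
    # version gets for free from the call stack.
    my @queue = ( { url => 'http://localhost/', parent => '', depth => 0 } );
    my %seen;

    while ( my $page = shift @queue ) {
        next if $seen{ $page->{url} }++;

        my @links = fetch_and_extract_links( $page->{url} );

        for my $link (@links) {
            push @queue, {
                url    => $link,
                parent => $page->{url},
                depth  => $page->{depth} + 1,
            };
        }
    }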

Bill Moseley
mailto:moseley@hank.org
Received on Wed Jul 11 13:32:32 2001