
Re: Grouping results

From: Bill Moseley <moseley(at)not-real.hank.org>
Date: Sun Dec 14 2003 - 14:30:32 GMT
On Sun, Dec 14, 2003 at 12:05:05AM -0800, John Angel wrote:
> Bill, is this added to official to-do list? :)

For internal to swish?  No.
For swish.cgi? No, because it works with the swish-e binary -- to fill
out the pages correctly (when there are duplicates) you need to be able
to continue reading results.
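For reference, the look-ahead paging idea discussed in the quoted thread below (fill a page while capping hits per host, then pass the next raw offset along in the "next" link) might be sketched like this. This is a hypothetical Python translation, not swish-e API code: `page_of_results` takes a plain list of document URLs standing in for the result stream, and the per-host counts here reset on each page (carrying them forward would mean passing the seen-state along with the offset).

```python
# Hypothetical sketch of look-ahead paging over grouped results.
# The list of URLs stands in for the swish-e result stream; none of
# these names are real swish-e calls.
from urllib.parse import urlparse

PAGE_SIZE = 10
MAX_PER_HOST = 2

def page_of_results(results, start, page_size=PAGE_SIZE):
    """Return (page, next_start): up to page_size results beginning at
    raw index `start`, skipping a result once its host has already
    appeared MAX_PER_HOST times, plus the raw index to use as the
    next page's starting location."""
    seen = {}
    page = []
    i = start
    while i < len(results) and len(page) < page_size:
        host = urlparse(results[i]).netloc
        seen[host] = seen.get(host, 0) + 1
        if seen[host] <= MAX_PER_HOST:
            page.append(results[i])
        i += 1
    return page, i
```

Because filtered pages don't line up with fixed offsets (0, 10, 20, ...), the returned `next_start` is what goes into the "next page" link, and "previous page" likewise has to be carried in the link rather than computed by subtraction.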

> 
> 
> ----- Original Message ----- 
> From: "Bill Moseley" <moseley@hank.org>
> To: "Multiple recipients of list" <swish-e@sunsite.berkeley.edu>
> Sent: Monday, December 01, 2003 23:37
> Subject: [SWISH-E] Re: Grouping results
> 
> 
> > On Mon, Dec 01, 2003 at 01:47:15PM -0800, John Angel wrote:
> > > That way there will be less than 10 results per page.
> > >
> > > E.g. what if all 10 results on page are from the same site, there
> > > will be only 2 results displayed?
> >
> > Well, that's what I meant when I said you would need to do some post
> > processing.  So instead of saying pages start at 0, 10, 20,... you would
> > have to track better and just offer previous and next.
> >
> > So on the first page you fetch enough results to make a complete page.
> > Then look ahead for the first record on the "next" page and then pass
> > that as the starting location in your links (to the next page).
> > "Previous Page" would also need to be tracked in links because you
> > can't just subtract 20 from the current location.
> >
> > Regardless, you would want to use the API so you can easily scan through
> > all the results.
> >
> > BTW -- the result list that swish maintains doesn't have backwards
> > links, IIRC.  SwishSeek() just starts at the beginning of the linked
> > list and walks (runs?) the list looking for the requested entry.  When
> > searching multiple indexes (and sorting by path) swish has to read all
> > the pathnames off disk when sorting.  So, in other words, you
> > may want to avoid seeking too many times.
> >
> >
> > >
> > >
> > > > From: Bill Moseley <moseley@hank.org>
> > > >Reply-To: moseley@hank.org
> > > >To: Multiple recipients of list <swish-e@sunsite.berkeley.edu>
> > > >Subject: [SWISH-E] Re: Grouping results
> > > >Date: Tue, 25 Nov 2003 13:28:52 -0800 (PST)
> > > >
> > > >On Tue, Nov 25, 2003 at 01:24:41PM -0800, Bill Moseley wrote:
> > > > > On Sun, Nov 23, 2003 at 12:45:23PM -0800, John Angel wrote:
> > > > > > Is it possible to group results by site like on Google (to
> > > > > > display only 2 hits from the same site, not all of them)?
> > > > >
> > > > > Did I already respond to this?
> > > > >
> > > > > You would have to post-process;  Need to think about what to do if
> > > > > showing a page of results at a time -- you might come up short.
> > > > >
> > > > > Fake code:
> > > > >
> > > > > my %seen;
> > > > > while ( my $result = next_result() ) {
> > > > >     my $uri = URI->new( $result->swishdocpath );
> > > > >     next if $seen{ $uri->host }++ == 2;
> > > >
> > > >I assume you want something more like >= 2.
> > > >
> > > >
> > > > >     show_result( $result );
> > > > > }
> > > > >
> > > > > --
> > > > > Bill Moseley
> > > > > moseley@hank.org
> > > > >
> > > > >
> > > >
> > > >--
> > > >Bill Moseley
> > > >moseley@hank.org
> > > >
> > >
> > >
> > >
> >
> > -- 
> > Bill Moseley
> > moseley@hank.org
> >
> >
> 

-- 
Bill Moseley
moseley@hank.org
Received on Sun Dec 14 14:30:40 2003
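[Archive note] As a footnote to the `== 2` vs `>= 2` point in the quoted Perl sketch: with a post-incremented counter, a `== 2` test skips only the third hit from a host; the fourth and later hits pass the test again and get displayed. A quick illustration (hypothetical Python, mirroring the Perl post-increment):

```python
def shown_with(skip_test):
    """Return which of five hits from a single host get displayed when
    the skip test is applied to the post-incremented per-host count."""
    seen = 0
    shown = []
    for hit in range(1, 6):
        count = seen      # post-increment: test the old value, then bump
        seen += 1
        if skip_test(count):
            continue      # skipped, not displayed
        shown.append(hit)
    return shown

# skip_test = (count == 2) skips only the third hit; later hits leak
# through again.  skip_test = (count >= 2) caps display at two per host.
```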