bogoutil runs forever dumping database?
David Relson
relson at osagesoftware.com
Tue Jun 29 00:13:43 CEST 2004
On Mon, 28 Jun 2004 16:24:37 -0400
Thor Lancelot Simon wrote:
> On Mon, Jun 28, 2004 at 10:17:12PM +0200, Matthias Andree wrote:
> > Thor Lancelot Simon <tls at rek.tjls.com> writes:
> >
> > > Okay, so, db_verify says this:
> > >
> > > > db_verify *.db
> > > db_verify: Last item on page 2616 sorted greater than parent entry
> > > db_verify: DB->verify: wordlist.db: DB_VERIFY_BAD: Database verification failed
> > >
> > > How can I recover from this condition? My DB foo is pretty rusty.
> >
> > See http://bogofilter.sourceforge.net/faq.php#rescue for quick
> > recovery instructions. Note there is no guarantee that this
> > condition is recoverable.
>
> Well, that doesn't work; db_dump -r just runs forever, just like
> bogoutil -d does.
>
> I can use plain db_dump; that works, but gives me only a small part of
> the data before hanging, so that I have to ^C it. What I really
> need, I assume, is a way to just skip over the bad pages in the
> database; but I cannot find a simple way to do that.
>
> Thor
Thor,
Skipping bad pages would be nice, but BerkeleyDB doesn't ship such a
utility (AFAIK). The easiest fix is to take whatever spam and ham you
still have and train a new database from scratch. Once that's done, a
good safety precaution is to periodically run "bogoutil -d" to save a
plain-text copy of the database that can be reloaded later.
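As a rough sketch, the rebuild-and-backup routine above might look like
this (the mailbox names, wordlist path, and cron schedule are all
hypothetical; adjust them to your setup):

```shell
# Rebuild the wordlist from saved corpora (hypothetical mbox names):
bogofilter -n -B ham.mbox     # register known-good mail as ham
bogofilter -s -B spam.mbox    # register known spam

# Periodic safety dump to plain text (suitable for a cron job):
bogoutil -d ~/.bogofilter/wordlist.db > ~/wordlist-backup.txt

# If the database corrupts later, recreate it from the last dump:
bogoutil -l ~/.bogofilter/wordlist.db.new < ~/wordlist-backup.txt
mv ~/.bogofilter/wordlist.db.new ~/.bogofilter/wordlist.db
```

The text dump is the real insurance here: unlike the binary .db file,
it survives a BerkeleyDB page-level corruption and loads cleanly into a
fresh database.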
HTH,
David