UUENCOD [was: attachments and binary data]

David Relson relson at osagesoftware.com
Mon Dec 6 13:14:38 CET 2004


On Mon, 06 Dec 2004 10:07:52 +0300
Evgeny Kotsuba wrote:

> David Relson wrote:
> 
> >On Fri, 03 Dec 2004 17:02:01 +0300
> >Evgeny Kotsuba wrote:
> >
> >>Hi, David!
> >>
> >>With last change in lexer_v3.l  from  CVS the  thing like
> >>
> >>> On my AMD 2800, it takes about 30 seconds to process the
> >>
> >>is  changed  no normal  speed,  but  there still  too many  "words" 
> >>from binary  attachments  in data base.
> >
> Oops....
> I meant to write "is changed _to_ normal speed"
> 
> >I've attached another patch that prevents encoded tokens from
> >spanning lines.  It should help.
> >
> It is the same as in CVS from 2.12.2004, and it still has "words"
> made from binaries.
> 
> 
> SY,
> EK

Evgeny,

The "begin 666" patch has _not_ been included in CVS for two reasons.

First, I've been waiting to hear how well it works.

Second, you're the only person reporting problems with uuencoded text.
With so little demand for the change, it may not be merged into
bogofilter, and you may need to apply the patch to each new release
before building your executables.
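For anyone following along, the idea behind a "begin 666" patch is that a
uuencoded attachment is framed by a `begin <mode> <file>` header and an
`end` trailer, with the binary payload in between; if the lexer skips
everything inside that frame, no junk tokens reach the database.  The
following is only an illustrative sketch of that logic in Python, not the
actual flex rule from lexer_v3.l (the function name `strip_uuencoded` is
invented for this example):

```python
import re

# A uuencode header looks like "begin 666 filename" -- "begin", an octal
# mode, and a file name.  (This regex and function are illustrative only,
# not bogofilter's real lexer code.)
BEGIN_RE = re.compile(r"^begin [0-7]{3,4} \S")

def strip_uuencoded(lines):
    """Return the input lines with any uuencoded block removed."""
    out = []
    in_uu = False
    for line in lines:
        if not in_uu and BEGIN_RE.match(line):
            in_uu = True          # entering a uuencoded attachment
            continue
        if in_uu:
            if line.strip() == "end":
                in_uu = False     # trailer reached; resume normal text
            continue              # drop payload lines (and "end" itself)
        out.append(line)
    return out
```

A message body such as `["Hi,", "begin 666 blob.bin", "M9FEL92Xu", "`",
"end", "Bye"]` would come back as `["Hi,", "Bye"]`, so the tokenizer
never sees the encoded payload.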

HTH,

David



-- 
David Relson                   Osage Software Systems, Inc.
relson at osagesoftware.com       Ann Arbor, MI 48103
www.osagesoftware.com          tel:  734.821.8800



More information about the bogofilter-dev mailing list