libgmime [was Re: lexer, tokens, and content-types]

David Relson relson at osagesoftware.com
Wed Dec 11 13:31:39 CET 2002


At 07:27 AM 12/11/02, Greg Louis wrote:

>On 20021210 (Tue) at 14:12:50 -0500, Scott Lenser wrote:
>
> > > > uuencode
> > >
> > > we don't need that. text will not be uuencoded because it would be
> > > unreadable in Netscape then.
> > >
> >
> > I don't understand why we don't need this but I can't find a single
> > spam that is uuencoded so I'll assume you are right.
>
>I can find nonspams that are uuencoded and they make an awful
>mess in the training database.  I haven't looked for uuencoded spams.


Greg,

Have you applied Allyn's patch to lexer.l?  It recognizes 61-character
uuencode data lines, like the ones you've encountered, and throws them away.
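
For anyone curious what such a rule looks like, here is a minimal flex
sketch.  It is not Allyn's actual patch; the rule shape and character
range are my assumptions.  A full uuencode data line encodes 45 bytes,
so it starts with 'M' (the length character for 45) followed by 60
characters from the uuencode alphabet, 61 characters in all.  Matching
the whole line with an empty action makes flex drop it before it ever
reaches the tokenizer:

    /* uuskip.l -- illustration only, not the real bogofilter lexer.l.
     * Skips full-length uuencode data lines; everything else is echoed.
     */
    %{
    #include <stdio.h>
    %}
    %%
    ^M[\x20-\x60]{60}$   { /* 'M' + 60 chars from space..backtick: discard */ }
    .|\n                 { ECHO; /* pass everything else through */ }
    %%
    int yywrap(void) { return 1; }
    int main(void)   { yylex(); return 0; }

Build it with something like "flex uuskip.l && cc lex.yy.c -o uuskip" and
pipe a message through it; the uuencoded body lines vanish while the
begin/end lines and normal text survive.  The real patch presumably hooks
into bogofilter's existing rules rather than echoing, but the idea is the
same: recognize the 61-character lines and take no token from them.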

David
