It’s just data

Wiki Spam Throttle

def isForbidden(self):
    """ only allow two anonymous posts per hour per ip address"""
    # needs: import os, time; config is MoinMoin's configuration module
    if self.request_method == 'POST' and not self.user.valid:
        # logged-in users are never throttled
        pages = {}
        # event.log lines begin with a microsecond timestamp
        lasthour = str(long((time.time() - 3600) * 1000000))
        log = os.path.join(config.data_dir, 'event.log')
        for line in file(log):
            if line > lasthour and line.find("SAVEPAGE") >= 0:
                if self.remote_addr == line.split('&')[1].split('=')[1]:
                    pages[line.split('&')[0].split('=')[1]] = '1'
                    if len(pages) >= 2: return True
    return False

Update: I've changed it to allow only two pages to be updated per hour. A single page may be updated multiple times without penalty.

Have you ever been comment- or wiki-spammed by spammers using zombie PCs? These days, I routinely see that kind of stuff hit my blog :/

Posted by Robert Sayre at

Is the general experience that comment spam is being automated?

If so, why not use captchas? Wez Furlong has a nice text-based implementation. Image-based captchas can be made to work if hotlinking is prevented (e.g. via some kind of session).
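The text-based approach Harry mentions can be sketched along these lines. To be clear, this is a hypothetical illustration of the idea, not Wez Furlong's actual implementation; the function names are mine:

```python
import random

def make_challenge(rng=None):
    """Return a (question, answer) pair for a simple text challenge.
    Hypothetical sketch -- not Wez Furlong's actual implementation."""
    rng = rng or random.Random()
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    return "What is %d plus %d?" % (a, b), str(a + b)

def check_answer(expected, submitted):
    # forgiving about surrounding whitespace in the user's reply
    return submitted.strip() == expected
```

The server would store the expected answer in the session and reject the POST unless `check_answer` passes.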

Of course if it's armies of spam posting goblins, that's not so cool.

Posted by Harry Fuecks at

Your turn - in Firefox:

XML Parsing Error: not well-formed
Location: [link]
Line Number 40, Column 31:
pages[line.split('&')[0].split('=')[1]]='1'
------------------------------^

Posted by Danny at

Danny: I run Firefox too.  ;-)

Already caught and updated.

Posted by Sam Ruby at

Sam, I saw this kind of spam, but each edit came from a different IP address, so I'm not sure your solution would work. I corrected it quickly, even deleting the spam from version control, so that the links wouldn't survive in old revisions. It hasn't happened to me in the last couple of months.

Posted by Santiago Gala at

Santiago: take a look at the Atom wiki's recent changes.

Posted by Sam Ruby at

Have you checked that self.remote_addr is actually the user's remote address and not a proxy's remote address? I mention this mostly because I've been caught out by this recently on livejournal and slashdot when they put in their anti-flood rules.

Here in Spain roughly 80-90% of broadband users use Telefónica ADSL, and Telefónica invisibly proxies every single one of those customers' web requests. This means that at any one point, there's several hundred thousand requests being processed through a relatively small number of proxy servers. Anyway, the upshot of it is, if there's more than one Spanish resident trying to update your wiki, it will likely trip this up if you're not taking the correct remote address into account.
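Giles's proxy caveat can be handled by unwrapping the `X-Forwarded-For` header, but only when the direct peer is a proxy you trust, since any client can forge the header. A minimal sketch, with a hypothetical helper name and trusted-proxy list:

```python
def client_addr(remote_addr, headers, trusted_proxies=("127.0.0.1",)):
    """Unwrap X-Forwarded-For only when the direct peer is a proxy we
    trust, since the header can be forged by any client."""
    if remote_addr in trusted_proxies:
        forwarded = headers.get("X-Forwarded-For", "")
        if forwarded:
            # by convention the leftmost entry is the original client
            return forwarded.split(",")[0].strip()
    return remote_addr
```

A transparent carrier proxy like Telefónica's sits outside your trust list, so in that case you would still see the proxy's address; there is no fully reliable fix for invisible proxying.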

Posted by Giles Antonio Radford at

"and Telefónica invisibly proxies every single one of those customers' web requests"

I believe AOL does the same. Audio captchas are another option.

Posted by Harry Fuecks at

Sam, instead of restricting the good users, why not simply require a login for editing the Wiki?

Posted by Randy Charles Mørin at

There is no restriction for people who login.

"""only allow two anonymous posts per hour per ip address"""
if self.request_method == 'POST' and not self.user.valid:

Posted by Sam Ruby at

I've implemented a regex-based spam filter for JSPWiki, which seems to work really nicely (I've deployed it on one site).  I'm also contemplating adding an MT-Blacklist interface.

The spam filter rules are on a single page, and only trusted users (of which there are plenty) are able to edit that page.
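The rules-on-a-wiki-page design Janne describes can be sketched as follows. This is a Python illustration of the idea only; JSPWiki's actual filter is written in Java, and the function names here are hypothetical:

```python
import re

def load_rules(page_text):
    """One regular expression per line of the rules page;
    blank lines and #-comment lines are skipped."""
    rules = []
    for line in page_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            rules.append(re.compile(line, re.IGNORECASE))
    return rules

def is_spam(text, rules):
    # reject the edit if any rule matches the submitted content
    return any(rule.search(text) for rule in rules)
```

Keeping the rules page editable only by trusted users closes the obvious loophole of spammers deleting their own patterns.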

Posted by Janne Jalkanen at

People shouldn't worry about "spam" in their sandbox, though. They should just make sure the sandbox has the robots.txt set so that they won't get the benefit of more links. And maybe an hourly purge!
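Robert's robots.txt suggestion amounts to something like the following (the `/wiki/SandBox` path is a hypothetical example; substitute the sandbox's real URL):

```
User-agent: *
Disallow: /wiki/SandBox
```

With the sandbox excluded from crawling, spam links posted there earn no search-engine benefit.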

Posted by Robert at

Sam Ruby: Wiki Spam Throttle...

Excerpt from Guide to Ease at

Trac Site Updated to Trunk

This Tuesday, Daniel updated the Trac site to use the current trunk version of Trac. The main motivation behind the update was to get rid of availability problems. In the 0.8 release, we added persistent session support, which required a......

Excerpt from about:cmlenz at
