The Artima Developer Community

Agile Buzz Forum
Fixing a Wiki Attack

James Robertson

Posts: 29924
Nickname: jarober61
Registered: Jun, 2003

Fixing a Wiki Attack Posted: Oct 3, 2006 11:26 AM

This post originated from an RSS feed registered with Agile Buzz by James Robertson.
Original Post: Fixing a Wiki Attack
Feed Title: Cincom Smalltalk Blog - Smalltalk with Rants
Feed URL: http://www.cincomsmalltalk.com/rssBlog/rssBlogView.xml
Feed Description: James Robertson comments on Cincom Smalltalk, the Smalltalk development community, and IT trends and issues in general.

The UIUC VW Wiki got spammed yesterday - well over a hundred pages. When it's only a handful, I fix them manually (unless someone beats me to it). Yesterday's attack was too big for that, though, so I sat down and wrote some workspace scripts. I grabbed the page source for all the modified pages listed on Recent Changes and stuffed it into a collection - it looked like this, but bigger:


strings := #(
'<A href="/VisualWorks/VisualWorks+WebServer+-+history">VisualWorks WebServer - history</A> 18:20:49 (ah1-p4id-56.advancedhosters.com)'
...
).
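
Incidentally, the same markup could have been fetched straight from the wiki rather than pasted by hand - a minimal sketch, assuming the Recent Changes page lives at /VisualWorks/Recent+Changes (that exact path is an assumption):


"Fetch the Recent Changes markup directly; the page path here is assumed"
recentChanges := (HttpClient new
	get: 'http://wiki.cs.uiuc.edu/VisualWorks/Recent+Changes') contents.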

From there, it was a matter of finding the right version of each page to revert to. This little snippet just pulled the URLs out of that mess:


urls := OrderedCollection new.
base := 'http://wiki.cs.uiuc.edu'.
wiki := '/VisualWorks'.
strings do: [:each | | stream url |
	"The href target is the first double-quoted token in each entry"
	stream := each readStream.
	stream through: $".
	url := stream upTo: $".
	urls add: url].
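
For the sample entry shown earlier, that extraction yields just the page path:


urls first.  "=> '/VisualWorks/VisualWorks+WebServer+-+history'"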

From those, I built the page history URLs for each spammed page:


histUrls := OrderedCollection new.
urls do: [:each | | url |
	"Strip the duplicate wiki prefix, then point at the page's HISTORY view"
	url := base, wiki, '/HISTORY', (each copyReplaceAll: '/VisualWorks' with: '').
	histUrls add: url].

Then I grabbed each history page and scanned down to the second "VERSION" string - the first version link is the current (spammed) revision, so the second is the last good one. From its number I built the URL that promotes the page back to the way it should have been. I added in a delay so that I wasn't mounting a DoS attack on the server:


fixUrls := OrderedCollection new.
histUrls do: [:each |
	| content stream next num url tail |
	Transcript show: 'Getting: ', each; cr.
	content := (HttpClient new get: each) contents.
	stream := content readStream.
	"Skip the first VERSION link (the spammed revision);
	 the second one is the last good version"
	stream throughAll: 'VERSION/'.
	stream throughAll: 'VERSION/'.
	stream atEnd ifFalse: [
		next := stream upTo: $/.
		num := next asNumber.
		"The page name is the last path component of the history URL"
		tail := (UnixFilename named: each) tail.
		url := base, wiki, '/PROMOTE/', num printString, '/', tail.
		fixUrls add: url].
	"Be polite - don't hammer the server"
	(Delay forSeconds: 1) wait].
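
For the sample page, the resulting promote URL looks like this (the revision number 12 here is hypothetical - the real one comes off the history page):


'http://wiki.cs.uiuc.edu/VisualWorks/PROMOTE/12/VisualWorks+WebServer+-+history'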


Now, with the set of "fix" URLs in hand, I just ran each of them - with another delay for the same reason, and a handler for HTTP exceptions, so that I could collect any pages that didn't get fixed due to transient network errors:


missed := OrderedCollection new.
fixUrls do: [:each |
	Transcript show: 'Fixing: ', each; cr.
	[HttpClient new get: each]
		on: HttpException
		do: [:ex | Transcript show: 'Could not do: ', each; cr.
			missed add: each.
			ex return].
	(Delay forSeconds: 1) wait].

Then it was simply rinse and repeat for anything that got missed. All the spammed pages have now been restored, and I didn't have to visit each one manually.
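
One way to script that rinse-and-repeat, reusing the same delay and exception handling, would be to loop over missed and collect anything that fails again:


stillMissed := OrderedCollection new.
missed do: [:each |
	[HttpClient new get: each]
		on: HttpException
		do: [:ex | stillMissed add: each. ex return].
	(Delay forSeconds: 1) wait].
"if stillMissed isn't empty, run the same loop again over it"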


Read: Fixing a Wiki Attack

