Thursday, September 9, 2010

Infected website cleaning

When you monitor a lot of websites, you will sometimes find backdoors or other goodies even on the most up-to-date ones. A couple of days ago a client called and said that Google Safe Browsing was blocking one of his websites, and that another one was popping up advertisements when visited. It came as a surprise to me that those sites were infected, because I knew they had been updated recently and, to my knowledge, there was no known vulnerability in the latest version.

Poking around a bit, it didn't take long to realize that the infection had come in through FTP; there was no SQL injection or any other vulnerability in the code of the website.

Most of the time my work finishes there: I tell the client to clean up the code and change the FTP password. In this case the client was also a good friend, so I thought I would give him a bit of help.

The infected files were .php and .html files containing the following code,

<script src="hxxp://" type="text/javascript"></script>

grep gave the following result:

# grep -R youngarea *| wc
    140     420   16869

140 files infected; cleaning them one by one would take quite some time.
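Before cleaning, it can also help to have the actual list of infected files rather than just a match count; grep's -l flag prints only the matching file names. A small sketch of that (infected_files.txt is just a scratch name I picked):

```shell
# List the infected files by name instead of counting matches.
# -R: recurse into subdirectories, -l: print matching file names only
grep -Rl 'youngarea' . > infected_files.txt
wc -l < infected_files.txt
```

The count printed by wc should match the file count from the grep above.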

Sed to the rescue !

The following command looks in the current directory for files with the .php extension and replaces, in place (-i), any sequence that starts with <, contains youngarea, and ends with > with nothing, thus cleaning up the infection. (Note that .* is greedy, so on a line that carries other markup besides the injected tag it can eat more than you intend.)

find . -name "*.php" -type f | xargs sed -i 's/<.*youngarea.*>//g'
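Because -i edits files in place and the pattern is greedy, it is worth previewing what would change before committing. A sketch of a dry run, dropping -i and comparing sed's output against the original file:

```shell
# Dry run: run the substitution without -i and report files that would change.
# cmp -s compares sed's output (stdin, "-") against the original file quietly.
find . -name "*.php" -type f | while read -r f; do
    if ! sed 's/<.*youngarea.*>//g' "$f" | cmp -s - "$f"; then
        echo "would change: $f"
    fi
done
```

Only once that list looks right do you rerun with -i. (The while read loop also copes with most odd file names better than piping straight into xargs.)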

The same command was then applied to the files ending with .html:

find . -name "*.html" -type f | xargs sed -i 's/<.*youngarea.*>//g'

and the website was clean in a couple of seconds.
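Since the .php and .html passes are identical, they can also be combined into a single find, and sed's -i can take a suffix so that a backup of every edited file is kept. A sketch, assuming GNU sed and grep (BSD sed wants the suffix as a separate argument, -i '.bak'):

```shell
# One pass over both extensions; -i.bak keeps a backup of each edited file
find . \( -name "*.php" -o -name "*.html" \) -type f \
    | xargs sed -i.bak 's/<.*youngarea.*>//g'

# Verify: rerunning the grep, skipping the backups, should now count 0
grep -R youngarea . --exclude='*.bak' | wc -l
```

Once the count comes back zero you can delete the .bak files, and of course still change the FTP password, since that was the way in.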
