



Website Backup and Download


• Good off-site backup solution for database and files
• How Hard It Is to Backup a Website Completely?
• What files to backup on Lighttpd+MySQL+PHP server
• What is a 'best practice' backup plan for a website?
• Backup website with SSH to offsite location
• Backup website without direct SQL access and only FTP access
• What's the best way to create a static backup of a website?
• backup website data (image file, database) no downtime
• How to import pop3/imap mails to new server (backup and restore IMAP/POP3 account)?





























How do I download a whole website in general, and *.blogspot.com in particular? Note that I don't necessarily have admin access to that website. In fact, I am just trying to download a third-party website just in case it goes up in flames...










backup

asked Jul 24 '09 at 11:44 by Graviton

















  • I hope you're not trying to download every page of every subdomain of blogspot.com...

    – David Z
    Jul 24 '09 at 17:03











  • see also superuser.com/questions/14403/…

    – rogerdpack
    Aug 30 '13 at 18:36



























7 Answers
































I've found httrack (http://www.httrack.com/) very useful for this in the past.



If you use any tool to try to download an entire site (not just httrack), make sure you show a little consideration to the site. See httrack's "what not to do" page for some pointers on that.
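As a rough sketch of what an httrack invocation for this might look like (the URL, output directory, and filter below are placeholders, not taken from the question):

    # Hypothetical example: mirror one blogspot blog into ./blog-mirror,
    # following only links on that blog's own host.
    httrack "http://example.blogspot.com/" -O ./blog-mirror "+example.blogspot.com/*" -v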






answered Jul 24 '09 at 12:31 by David Spillett























  • +1 because httrack creates a local mirror that you can browse.

    – sybreon
    Jul 24 '09 at 13:21
































You can use wget to mirror the website (provided it does not have Flash- or JavaScript-based navigation).



Look here or just check the command's manual. wget is available for Unix systems and Windows.
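For example, a minimal sketch of such a mirror run (the URL is a placeholder, and the throttling values are only illustrative):

    # Hypothetical example: recursive mirror with links rewritten for offline
    # viewing, throttled so the target server isn't hammered.
    wget --mirror --convert-links --page-requisites --adjust-extension \
         --wait=1 --limit-rate=200k \
         http://example.blogspot.com/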






answered Jul 24 '09 at 11:48 by pQd (last edited Feb 12 '16 at 14:07)

























  • the first link doesn't make sense

    – chaim
    Nov 11 '15 at 8:51
































If you don't have admin access to the site to use its backup tool, then you could back up the HTML contents of your pages by viewing the source, or, if you just want the actual written content of the articles, copy that. You can also download your images and other attachments from the site. This article gives you details of how you could do that in a more efficient way.

You can also use wget, if you have it, to get hold of the site information.

Bear in mind, though, that this will not give you the info you need to simply take your blog and run it somewhere else; there is a whole backend behind blogspot that is loading your sites, etc.
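If you only need individual posts rather than a full crawl, a sketch of grabbing one page together with its images might look like this (the post URL is a placeholder):

    # Hypothetical example: fetch a single post plus the images/CSS it needs,
    # rewriting links so the saved copy can be viewed offline.
    wget --page-requisites --convert-links --adjust-extension \
         http://example.blogspot.com/2009/07/some-post.html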






answered Jul 24 '09 at 11:48 by Sam Cogan












































wget, I believe, will crawl a page for you.

The option -r, I believe, is what you want. Note in the following snip the part about converting links for offline viewing. Since you said you want to have this page just in case it "goes up in flames", this will allow you to browse it locally.

From the man page:



    Wget can follow links in HTML and XHTML pages and create local versions of remote web
    sites, fully recreating the directory structure of the original site. This is sometimes
    referred to as "recursive downloading." While doing that, Wget respects the Robot
    Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in
    downloaded HTML files to the local files for offline viewing.
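A sketch of how those options might be combined (the URLs and the extra image host are placeholders; --span-hosts/--domains are only needed if images live on another host, as the comment below notes):

    # Hypothetical example: recursive download, links converted for offline
    # browsing, optionally spanning to a whitelisted image host.
    wget -r --convert-links --page-requisites \
         --span-hosts --domains=example.blogspot.com,images.example-cdn.com \
         http://example.blogspot.com/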





answered Jul 24 '09 at 11:48 by rev (last edited Jul 24 '09 at 11:54)

























  • While backing up a blog with wget, I found it useful to add the "--span-hosts" option for the links/images that are not hosted in the same domain.

    – Student
    May 16 at 2:26
































If you want something a little more advanced than wget, take a look at Black Widow.






answered Jul 24 '09 at 11:57 by Dentrasi












































The easiest way to download an entire website is to use Website Downloader. With nothing to install or configure, simply input the website URL you want to download and press Download.






answered Aug 14 '16 at 12:07 by Deb












































In case this helps someone: SiteSucker is a superb Mac OS X app that does what you want. wget doesn't always do what you'd expect, especially with JS-based menus, even with its plethora of options. And httrack is not a visual tool on Mac; it's installed with Homebrew. SiteSucker is by far the simplest and most reliable tool for a local download of all HTML and assets, as if you were running a full static version of that site.






answered May 16 at 4:07 by Khom Nazid






















