Website Backup and Download
How do I download a whole website in general, and *.blogspot.com in particular? Note that I don't necessarily have admin access to that website. In fact, I am just trying to download a third-party website, just in case it goes up in flames...
Tagged: backup
asked Jul 24 '09 at 11:44 by Graviton
I hope you're not trying to download every page of every subdomain of blogspot.com... – David Z, Jul 24 '09 at 17:03
see also superuser.com/questions/14403/… – rogerdpack, Aug 30 '13 at 18:36
7 Answers
7
active
oldest
votes
I've found httrack (http://www.httrack.com/) very useful for this in the past.
If you use any tool to try to download an entire site (not just httrack), make sure you show a little consideration to the site. See httrack's "what not to do" page for some pointers on that.
– David Spillett, Jul 24 '09 at 12:31
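As a rough illustration (the URL, output directory, and filter pattern below are placeholders, not from the original answer), a basic httrack mirror command follows the pattern shown in httrack's own documentation:

    # Mirror the blog into ./mirror, following only links on that blog's domain.
    httrack "http://example.blogspot.com/" -O ./mirror "+example.blogspot.com/*" -v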
+1 because httrack creates a local mirror that you can browse. – sybreon, Jul 24 '09 at 13:21
You can use wget to mirror the website (provided it does not have Flash- or JavaScript-based navigation).
Look here or just check the command's manual. wget is available for Unix systems and Windows.
– pQd, Jul 24 '09 at 11:48 (edited Feb 12 '16 at 14:07)
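For instance, a typical wget mirroring invocation could look like the following (the URL is a placeholder; the flags are standard wget options):

    # Recursively mirror the site, rewrite links for offline browsing,
    # fetch page requisites (images/CSS), and throttle a little to be polite.
    wget --mirror --convert-links --page-requisites --adjust-extension \
         --wait=1 --limit-rate=200k http://example.blogspot.com/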
the first link doesn't make sense – chaim, Nov 11 '15 at 8:51
If you don't have admin access to the site to use its backup tool, you could back up the HTML contents of your pages by viewing the source, or, if you just want the actual written content of the articles, copy that. You can also download your images and other attachments from the site. This article gives you details of how to do that more efficiently.
You can also use wget, if you have it, to get hold of the site information.
Bear in mind, though, that this will not give you what you need to simply take your blog and run it somewhere else; there is a whole PHP backend behind Blogspot that serves your pages.
– Sam Cogan, Jul 24 '09 at 11:48
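As a small, hedged example of that wget approach for a single page and its attachments (the URL is hypothetical):

    # Fetch one post plus the images/CSS it references, with links rewritten
    # so the saved copy renders offline.
    wget --page-requisites --convert-links http://example.blogspot.com/2009/07/some-post.html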
wget, I believe, will crawl a page for you; the -r option is what you want. Note, in the following snippet, the part about converting links for offline viewing. Since you said you want to have this page just in case it "goes up in flames", this will allow you to browse it locally.
From the man page:

    Wget can follow links in HTML and XHTML pages and create local versions of remote web
    sites, fully recreating the directory structure of the original site. This is sometimes
    referred to as "recursive downloading." While doing that, Wget respects the Robot
    Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in
    downloaded HTML files to the local files for offline viewing.

– rev, Jul 24 '09 at 11:48 (edited at 11:54)
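A concrete command along the lines of that description (placeholder URL; -r and --convert-links are the options the excerpt refers to) might be:

    # Recursive download with links converted for local, offline viewing.
    wget -r --convert-links --wait=1 http://example.blogspot.com/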
While backing up a blog with wget, I found it useful to add the --span-hosts option for the links/images that are not hosted on the same domain. – Student, May 16 at 2:26
If you want something a little more advanced than wget, take a look at Black Widow.
– Dentrasi, Jul 24 '09 at 11:57
The easiest way to download an entire website is to use Website Downloader. With nothing to install or configure, simply enter the URL of the website you want to download and press Download.
– Deb, Aug 14 '16 at 12:07
In case this helps someone: SiteSucker is a super Mac OS X app that does what you want. wget doesn't always do what you'd expect, especially with JS-based menus, even with its plethora of options. And httrack is not a visual tool on the Mac; it's installed with Homebrew. SiteSucker is by far the simplest and most reliable tool for a local download of all the HTML and assets, as if you were running a full static version of that site.
– Khom Nazid, May 16 at 4:07
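If you do want the command-line httrack on a Mac via Homebrew, as the answer mentions, the install step would presumably be just (assuming Homebrew is already set up and an httrack formula is available):

    # Install the command-line httrack tool with Homebrew.
    brew install httrack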