
Website Backup and Download


How do I download a whole website in general, and *.blogspot.com in particular? Note that I don't necessarily have admin access to that website; in fact, I am just trying to download a third-party website in case it goes up in flames...

Tagged: backup

asked Jul 24 '09 at 11:44
Graviton

  • I hope you're not trying to download every page of every subdomain of blogspot.com... – David Z, Jul 24 '09 at 17:03

  • see also superuser.com/questions/14403/… – rogerdpack, Aug 30 '13 at 18:36

7 Answers














I've found httrack (http://www.httrack.com/) very useful for this in the past.



If you use any tool to try to download an entire site (not just httrack), make sure you show a little consideration to the site. See httrack's "what not to do" page for some pointers on that.
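
As an illustration (a sketch only, not from the original answer): with the httrack command-line client, a polite single-site mirror could look like the line below. The URL is a placeholder; -O sets the output directory, -A caps the transfer rate in bytes per second, and -c limits the number of simultaneous connections.

    # Mirror one blog into ./blog-mirror, throttled so as not to hammer the server
    httrack "http://example.blogspot.com/" -O ./blog-mirror -A25000 -c2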






answered Jul 24 '09 at 12:31
David Spillett

  • +1 because httrack creates a local mirror that you can browse. – sybreon, Jul 24 '09 at 13:21














You can use wget to mirror the website (provided it does not have Flash- or JavaScript-based navigation).



Look here, or just check the command's manual. wget is available for Unix systems and Windows.
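
For example, a typical mirroring command might be (a sketch; the URL is a placeholder, and the options are standard GNU wget flags):

    wget --mirror --convert-links --page-requisites --adjust-extension \
         --no-parent --wait=1 http://example.blogspot.com/

Here --mirror enables recursion with timestamping, --convert-links rewrites links for offline browsing, --page-requisites fetches the images and CSS each page needs, --adjust-extension saves pages with an .html suffix, and --wait=1 pauses between requests out of consideration for the server.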






answered Jul 24 '09 at 11:48 (edited Feb 12 '16 at 14:07)
pQd

  • the first link doesn't make sense – chaim, Nov 11 '15 at 8:51














If you don't have admin access to the site to use its backup tool, you could back up the HTML contents of your pages by viewing the source, or, if you just want the actual written content of the articles, copy that. You can also download your images and other attachments from the site. This article gives details of how to do that more efficiently.

You can also use wget, if you have it, to get hold of the site information.

Bear in mind, though, that this will not give you the information you need to take your blog and run it somewhere else; there is a whole PHP backend behind blogspot that serves your pages.






answered Jul 24 '09 at 11:48
Sam Cogan














wget, I believe, will crawl a page for you.

The -r option, I believe, is what you want. Note, in the following excerpt, the part about converting links for offline viewing. Since you said you want this page just in case it "goes up in flames", this will allow you to browse it locally.

From the man page:



    Wget can follow links in HTML and XHTML pages and create local versions of remote web
    sites, fully recreating the directory structure of the original site. This is sometimes
    referred to as "recursive downloading." While doing that, Wget respects the Robot
    Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in
    downloaded HTML files to the local files for offline viewing.
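
Putting that together, a minimal recursive grab with link conversion might look like this (a sketch; the URL is a placeholder):

    # -r recurses, -l inf lifts the default depth limit of 5,
    # -k converts links for offline viewing, -p fetches page requisites
    wget -r -l inf -k -p http://example.blogspot.com/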





answered Jul 24 '09 at 11:48 (edited Jul 24 '09 at 11:54)
rev

  • While backing up a blog with wget, I found it useful to add the "--span-hosts" option for the links/images that are not hosted on the same domain. – Student, May 16 at 2:26
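
As a sketch of what that comment looks like in practice (the domain names are placeholders, not from the original thread): -H enables spanning to other hosts, and -D restricts the crawl to a comma-separated whitelist of domains so it cannot wander off across the web.

    wget -r -k -p -H -D example.blogspot.com,bp.blogspot.com http://example.blogspot.com/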














If you want something a little more advanced than wget, take a look at Black Widow.






answered Jul 24 '09 at 11:57
Dentrasi














The easiest way to download an entire website is to use Website Downloader. With nothing to install or configure, simply enter the URL of the website you want to download and press Download.






answered Aug 14 '16 at 12:07
Deb














In case this helps someone: SiteSucker is a super Mac OS X app that does what you want. wget doesn't always do what you'd expect, especially with JS-based menus, even with its plethora of options, and httrack is not a visual tool on the Mac; it's installed with Homebrew. SiteSucker is by far the simplest and most reliable tool for a local download of all HTML and assets, as if you were running a full static version of that site.






answered May 16 at 4:07
Khom Nazid