Postgres pg_dump using compression - Keeping schema only for some tables


I currently take nightly exports of a Postgres 10.5 database, dumping schema only for two very large tables whose data I don't need. I compress the export when it's done, but going forward I would really like to use pg_dump's internal compression, since it was much faster in my testing.



Here is the workflow that I am currently using:



# Dump schema + data for everything except the two large tables
pg_dump -U postgres -h example.com -T very_large_tbl -T large_tbl db > /tmp/mydbdump

# Append a schema-only dump of the two large tables
pg_dump -U postgres -h example.com -s -t very_large_tbl -t large_tbl db >> /tmp/mydbdump

# Compress the combined dump
bzip2 -zc < /tmp/mydbdump > /tmp/compressed_dbdump.dat.bz2


The workflow must meet the following requirements:



  1. Tables very_large_tbl and large_tbl need to export with no data (schema only).

  2. All other tables need schema + data exported.

  3. The end result must be compressed.

My current workflow meets these requirements without issue. However, in testing on other databases, pg_dump -F c gave MUCH faster overall timings than exporting and then compressing afterwards.



From what I can tell, you cannot use -s in pg_dump to pick and choose which tables are dumped schema-only; it's all or nothing. Am I wrong here?



My question is this:



Is it possible to export some tables as schema only while using pg_dump's -F c option? Or is it possible to append to an existing dump that was created with -F c?



Even if the process has to happen in two parts, that's fine, and I'm open to out-of-the-box workarounds. I'm just wondering whether it's possible at all.

























postgresql database pg-dump
asked Apr 30 at 17:50 by emmdee




















          1 Answer
Well, as always, I just had to go back to the documentation and study it closely.

pg_dump has a --exclude-table-data option.

So to exclude the data for a table named my_table, it's as simple as --exclude-table-data=my_table.

This dumps the CREATE statements for the table but excludes any data within it.
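Applied to the workflow in the question, the whole nightly export could then collapse into a single command. This is a sketch using the host, database, and table names from the question, not something verified against a live server:

```shell
# One-pass custom-format dump using pg_dump's built-in compression.
# Schema + data for all tables, but data excluded for the two large ones.
pg_dump -U postgres -h example.com -F c \
    --exclude-table-data=very_large_tbl \
    --exclude-table-data=large_tbl \
    -f /tmp/compressed_dbdump.dat db
```

Note that a custom-format archive is restored with pg_restore (e.g. pg_restore -d db /tmp/compressed_dbdump.dat) rather than by feeding it to psql.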






answered Apr 30 at 18:44 by emmdee


























