Postgres pg_dump using compression - Keeping schema only for some tables
I am currently taking nightly exports of a Postgres 10.5 database, exporting schema only for 2 very large tables whose data I don't need. I compress the export when done, and would really like to switch to pg_dump's internal compression going forward, since it was much faster in my testing.
Here is the workflow that I am currently using:
# Dump schema + data for everything except the two large tables
pg_dump -U postgres -h example.com -T very_large_tbl -T large_tbl db > /tmp/mydbdump
# Append schema-only definitions for the two large tables
pg_dump -U postgres -h example.com -s -t very_large_tbl -t large_tbl db >> /tmp/mydbdump
# Compress the combined dump
bzip2 -zc < /tmp/mydbdump > /tmp/compressed_dbdump.dat.bz2
The workflow must satisfy the following requirements:
- Tables very_large_tbl and large_tbl are exported with no data (schema only).
- All other tables are exported with schema + data.
- The end result must be compressed.
My current workflow meets these requirements without issue. However, in testing on other databases, using pg_dump -F c (custom format, with built-in compression) gives much faster overall timings than exporting and then compressing afterwards.
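For reference, this is the kind of single-step custom-format dump I have in mind (a sketch; the -Z compression level is an assumption on my part, pg_dump uses a moderate default otherwise):

# Custom-format dump using pg_dump's built-in compression
pg_dump -U postgres -h example.com -F c -Z 5 -f /tmp/mydbdump.dump db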
From what I can tell, you cannot use -s in pg_dump to "pick and choose" which tables get schema-only treatment; it's all or nothing. Am I wrong here?
My question is this: is it possible to export some tables as schema only while using the -F c option of pg_dump? Or is it possible to append to an existing dump that was compressed with -F c?
Even if the process has to happen in two parts, that's fine; I'm open to "out of the box" workarounds. I'm just wondering if it's at all possible.
postgresql database pg-dump
asked Apr 30 at 17:50 by emmdee
1 Answer
As always, I just had to go back to the documentation and study it more closely.
pg_dump has an --exclude-table-data option.
So to exclude the data for a table named my_table, it's as simple as --exclude-table-data=my_table. This dumps the CREATE statements for the table but excludes any data within it, and it can be combined with the -F c output format.
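Putting it together, the nightly export collapses to a single command (a sketch reusing the connection details from the question; the output filename is my own choice):

# One-step export: schema only for the two big tables, schema + data for the rest,
# compressed via pg_dump's custom format
pg_dump -U postgres -h example.com \
  --exclude-table-data=very_large_tbl \
  --exclude-table-data=large_tbl \
  -F c -f /tmp/compressed_dbdump.dump db

The resulting file is restored with pg_restore, e.g. pg_restore -U postgres -d db /tmp/compressed_dbdump.dump.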
answered Apr 30 at 18:44 by emmdee