What is the best way to run / manage multiple rsyncs on the same host?
I am trying to move several TB of data from multiple hosts to other hosts. Each host will run 8 different rsync commands covering different directory structures, and I will be replicating this across nine different hosts.
Below is an example of what I'll be running on each host (different destinations but identical structures):
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/engine_* --exclude 'db1' --exclude 'db2' 10.2.2.16:/tmp/prod/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/client --exclude 'db1' --exclude 'db2' 10.2.2.16:/tmp/prod/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/logs --exclude 'db1' --exclude 'db2' 10.2.2.16:/tmp/prod/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/backup* --exclude 'db1' --exclude 'db2' 10.2.2.16:/tmp/prod/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/engine_* --exclude 'db' 10.2.2.16:/tmp/QA/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/client --exclude 'db' 10.2.2.16:/tmp/QA/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/var/lib/data/logs --exclude 'db' 10.2.2.16:/tmp/QA/RSYNC_TEMP
sudo -u svc_unix timeout 43200 rsync -azve ssh /tmp01/QA/backup* --exclude 'db' 10.2.2.16:/tmp/QA/RSYNC_TEMP
We plan on running this manually every night from 5pm to 5am (12-hour timeout) until everything is synced to the new hardware, with a final run to pick up any remaining changes when we officially cut over.
Obviously, I'd rather not start 72 different rsync jobs by hand every night, and I'd like a simple way of killing them early if necessary.
Does anyone have ideas on a good way to manage this task?
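For illustration, the per-host invocations above could be driven from one wrapper script that records PIDs, so a single command can kill the whole batch early. This is only a sketch: the source list is abbreviated to two of the eight trees, and each job is echoed rather than executed (swap the echo for the real command to launch the jobs).

```shell
#!/bin/bash
# Sketch: build the nightly job list, launch each in the background, and
# keep the PIDs so everything can be stopped with one kill.
dest="10.2.2.16:/tmp/prod/RSYNC_TEMP"
sources=(/tmp01/var/lib/data/client /tmp01/var/lib/data/logs)  # abbreviated

cmds=()
for src in "${sources[@]}"; do
  cmds+=("timeout 43200 rsync -aze ssh --exclude db1 --exclude db2 $src $dest")
done

pids=()
for c in "${cmds[@]}"; do
  echo "$c" &          # replace 'echo' with the bare command to run for real
  pids+=($!)
done
wait
# Stop the whole batch early with:  kill "${pids[@]}"
```

A cron entry at 5pm could start the wrapper, and the saved PIDs give the early-kill switch the question asks for.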
rsync
1
I found aggregating all the rsync commands into one, setting a bandwidth limit below link speed, and just letting it run continuously to be far easier than working around time-of-day restrictions and wasting bandwidth on many parallel transfers.
– anx
Jun 5 at 20:41
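The aggregated approach could look roughly like this: one capped rsync per destination instead of eight. This is a sketch only; the --bwlimit value (in units of 1024 bytes per second) is an assumption to tune to your link, and the command is echoed as a dry run.

```shell
#!/bin/bash
# Sketch: combine several source trees into a single bandwidth-capped rsync
# that can run around the clock. The 50000 cap is an assumed placeholder.
bwlimit=50000
cmd="rsync -az --bwlimit=$bwlimit -e ssh \
  /tmp01/var/lib/data/engine_* \
  /tmp01/var/lib/data/client \
  /tmp01/var/lib/data/logs \
  --exclude db1 --exclude db2 \
  10.2.2.16:/tmp/prod/RSYNC_TEMP"
echo "$cmd"   # drop the echo (and let the shell expand the glob) to run it
```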
asked Jun 5 at 18:59
user2752794
63
1 Answer
Are you looking for a way to script this? Would something like the following work, issued from the destination host with sudo -u svc_unix timeout 43200 preceding the script? (The rsync command is wrapped in echo as a dry run; remove the echo wrapper to run it for real.)
#!/bin/bash
# Configs:
ip=("10.1.1.1" "10.1.1.2" "10.1.1.3" "10.1.1.4" "10.1.1.5" "10.1.1.6" "10.1.1.7" "10.1.1.8")
source_base="/tmp01/var/lib/data"
source_array=(engine client logs backup)
destination_base="/tmp"
destination="RSYNC_TEMP"
## Just match the elements of this array with the IP addresses in ${ip[@]}
destination_array=(prod QA foo bar stuff stuff prod dev)
exclude_file="some.file"
echo "script start time: $(date)"
for i in "${!ip[@]}"; do
    echo "  start time for ${ip[$i]}: $(date)"
    # Pull all four trees from the remote host in one rsync call. The
    # destination is local (rsync cannot copy remote-to-remote), so it
    # carries no host: prefix.
    echo "rsync -azqe ssh ${ip[$i]}:$source_base/${source_array[0]}_* :$source_base/${source_array[1]} :$source_base/${source_array[2]} :$source_base/${source_array[3]} --exclude-from=$exclude_file $destination_base/${destination_array[$i]}/$destination/"
    echo "done with files coming from ${ip[$i]}"
done
echo "script end time: $(date)"
Scripting is likely the way I plan to go. Will this run them in parallel, or will it wait for one to finish before continuing the loop?
– user2752794
Jun 5 at 20:42
Serially, not in parallel. You could also keep a list of exclusions for each source in its own exclusion file, then make that argument another array.
– JayRugMan
Jun 5 at 20:51
You can open a screen session tab on the target for each source and run a script for each host in its own tab. A .screenrc file in your home directory will make that easy to manage.
– JayRugMan
Jun 5 at 22:19
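As an alternative to one screen tab per host, the per-host scripts could also be launched in parallel from a single shell and waited on together. A sketch, where the per-host script name sync_one_host.sh is hypothetical and the launch is echoed rather than executed:

```shell
#!/bin/bash
# Sketch: run one sync script per source host in parallel, then wait for all.
hosts=(10.1.1.1 10.1.1.2 10.1.1.3)

pids=()
for h in "${hosts[@]}"; do
  echo "would run: ./sync_one_host.sh $h" &   # replace echo with the real script
  pids+=($!)
done
wait "${pids[@]}"
```

Unlike the serial loop in the answer, this keeps all nine hosts transferring at once, at the cost of sharing the destination's bandwidth.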
edited Jun 5 at 20:49
answered Jun 5 at 20:35
JayRugMan
11
Thanks for contributing an answer to Server Fault!