How to verify that a deduplication has taken place?
Microsoft Windows Server 2012 and onwards offers a deduplication service that periodically scans files, finds identical chunks, and removes the redundant copies to save space.
To the user browsing the files, they should all look the same.
My problem is that I have a piece of software that fails when it reads a file that has been processed by deduplication. I set up a Windows server with the deduplication service to develop and test a fix, but I am not sure whether my test files are actually being deduplicated, and therefore whether my fix really works.
Is there anything in the file metadata indicating that deduplication has taken place? Or does the deduplication service keep an accessible database of the files it has processed?
I have already tried the obvious: create a file, copy it into the same folder, and then view the folder's properties - but the folder size amounts to both files, while I was expecting it to amount to the size of only one file.
windows-server-2012 deduplication
edited 2 days ago
Alejandro-2988924
asked Jun 6 '18 at 17:23
DraxDomax
2 Answers
Deduplication is implemented as a filter driver on top of NTFS (and now ReFS) and should be transparent to applications. You can always disable it for particular file sets if it causes issues.
To check deduplication status, use the Get-DedupStatus cmdlet. See:
https://docs.microsoft.com/en-us/powershell/module/deduplication/get-dedupstatus
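A minimal usage sketch (the drive letter is hypothetical; this assumes the Data Deduplication feature and its PowerShell module are installed):

```powershell
# Per-volume summary: number of optimized files and space saved.
Get-DedupStatus -Volume "D:"

# A quick pass/fail signal for a test setup: comparing InPolicyFilesCount
# with OptimizedFilesCount tells you whether the files you created have
# actually been processed yet.
Get-DedupStatus -Volume "D:" |
    Select-Object Volume, OptimizedFilesCount, InPolicyFilesCount, SavedSpace
```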
There's a way to visualize what's there. See:
https://www.foldersizes.com/features/windowsdeduplicationdiskspace
You can exclude particular files from deduplication jobs. See:
https://docs.microsoft.com/en-us/windows-server/storage/data-deduplication/advanced-settings
ExcludeFileType is the setting to look at:
ExcludeFileType - file types that are excluded from optimization (an array of file extensions). Some file types, particularly multimedia or files that are already compressed, do not benefit much from being optimized; this setting lets you configure which types are excluded.
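For example, to exclude the problematic file types from future optimization jobs (the extensions and drive letter here are hypothetical; Set-DedupVolume and Get-DedupVolume are part of the same Deduplication module):

```powershell
# Exclude already-compressed media types from optimization on volume D:.
Set-DedupVolume -Volume "D:" -ExcludeFileType "mp4","zip","jpg"

# Confirm the setting took effect.
Get-DedupVolume -Volume "D:" | Select-Object Volume, ExcludeFileType
```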
answered Jun 10 '18 at 15:18
BaronSamedi1958
The inner workings of the de-dupe service are stored in System Volume Information on each volume, but my understanding is that there's nothing human-readable in there telling you what has been deduplicated and what hasn't. It happens at the block level, not the file level.
I have already tried the obvious: create a file, copy that file in the same folder and then view the properties of the folder - but the size of the folder amounts to both files, while I was expecting it to amount to the size of only one file.
De-duplication occurs on a schedule. If you copy a file and immediately check its properties, it will not have been de-duplicated yet. You can use Start-DedupJob to force a dedupe optimization on a specific volume for your testing scenario.
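Putting that together for the testing scenario, a sketch (volume letter and file path are hypothetical): run an optimization job, then check for the reparse point that Data Deduplication leaves in place of an optimized file's contents:

```powershell
# Kick off an optimization job on the test volume and wait for it to finish.
Start-DedupJob -Volume "D:" -Type Optimization -Wait

# An optimized file's data is replaced by a reparse point into the chunk
# store, so the ReparsePoint attribute is a quick per-file indicator.
$file = Get-Item 'D:\share\testfile.bin'
if ($file.Attributes -band [IO.FileAttributes]::ReparsePoint) {
    "File has been deduplicated"
} else {
    "File has not been deduplicated (yet)"
}
```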
edited Jun 6 '18 at 22:16
answered Jun 6 '18 at 21:23
MDMarra