So much Tech and so little time

Server 2012

First, an important point about disabling dedup (via the GUI or PowerShell): disabling it only stops further deduplication from occurring, i.e. data that has already been deduplicated remains deduplicated.

If you want to “move” the data back into the original files and out of the deduplication store (the Chunk Store), you need to run the following PowerShell command:

PowerShell

Start-DedupJob -Volume <VolumeLetter> -Type Unoptimization

You can check the progress of the job by using:

PowerShell

Get-DedupJob

Here’s another gotcha: the Chunk Store (love that name) will not get smaller until you run two more job types, GarbageCollection and Scrubbing. GarbageCollection finds and removes unreferenced chunks, and Scrubbing performs an integrity check. Neither will work unless dedup is on… so enable dedup:

I posted about Microsoft Dedup recently and thought I should mention how to set up dedup:

Data deduplication is a feature that reduces space usage on a data volume by removing duplicate copies of data and replacing them with references that look exactly the same to the end user.
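Under the hood the idea is simple: split files into chunks, store each unique chunk once, and keep a per-file list of chunk references so reads reassemble the original bytes. A minimal Python sketch of that idea (tiny fixed-size chunks for the demo; Microsoft’s actual implementation uses variable-size chunks of roughly 32–128 KB):

```python
import hashlib

chunk_store = {}   # hash -> chunk bytes (each unique chunk stored once)
file_table = {}    # filename -> list of chunk hashes (the "reference" view)

CHUNK_SIZE = 4     # tiny fixed size for demo purposes only

def store_file(name, data):
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        h = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(h, chunk)  # only stored if not already present
        refs.append(h)
    file_table[name] = refs

def read_file(name):
    # Reassemble the file from the chunk store - identical to the user
    return b"".join(chunk_store[h] for h in file_table[name])

store_file("a.txt", b"AAAABBBBCCCC")
store_file("b.txt", b"AAAABBBBDDDD")  # shares two chunks with a.txt
print(len(chunk_store))               # 4 unique chunks instead of 6
print(read_file("b.txt"))             # b'AAAABBBBDDDD'
```

The two files share their first eight bytes, so only four unique chunks are stored instead of six, while reads still return the original contents.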

Microsoft does not recommend dedup on database files such as .edb, .mdf and .ldf. This feature helps IT admins reduce storage costs when it’s applied to the right data, such as file shares like home folders.

To turn on the deduplication feature, use the command below (where E: is the volume):

PowerShell

Enable-DedupVolume E:

To set the minimum file age (in days) before files are deduplicated:

PowerShell

Set-DedupVolume E: -MinimumFileAgeDays 30
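The MinimumFileAgeDays setting simply means the optimization job skips files modified more recently than the threshold. A rough Python illustration of that selection logic (the file names and mtimes are made up for the example):

```python
import time

MIN_AGE_DAYS = 30

def eligible_for_dedup(files, now=None):
    """Return files whose last-modified time is at least MIN_AGE_DAYS old.

    `files` maps name -> mtime (seconds since the epoch).
    """
    now = now if now is not None else time.time()
    cutoff = now - MIN_AGE_DAYS * 86400
    return [name for name, mtime in files.items() if mtime <= cutoff]

now = time.time()
files = {"old_report.docx": now - 90 * 86400,  # 90 days old -> eligible
         "fresh_notes.txt": now - 2 * 86400}   # 2 days old  -> skipped
print(eligible_for_dedup(files, now))          # ['old_report.docx']
```

Skipping young files avoids repeatedly re-chunking data that is still changing.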

To get a list of deduped volumes, run

PowerShell

Get-DedupVolume

To get dedup status, run

PowerShell

Get-DedupStatus

To start a dedup job manually, run

PowerShell

Start-DedupJob -Volume E: -Type Optimization

To get current dedup schedule, run

PowerShell

Get-DedupSchedule

How to calculate dedup rate

Installing the “Data Deduplication” feature automatically installs DDPEVAL.exe in C:\Windows\System32. This tool allows you to determine whether deduplication would be effective on your data type.

This tool can be copied from any server running Windows Server 2012 R2 or Windows Server 2012 to systems running Windows Server 2012, Windows Server 2008 R2, or Windows 7. You can use it to determine the expected savings you would get if deduplication were enabled on a particular volume.

Found a drive was running low on space today, and on closer inspection with TreeSize I found that the ChunkStore (brilliant name) was taking up the drive space:

Odd, as it looked like dedup wasn’t working:

To fix it I ran the following PowerShell:

PowerShell

Start-DedupJob -Volume <VolumeLetter> -Type GarbageCollection

Start-DedupJob -Volume <VolumeLetter> -Type Scrubbing

What does this do, I hear you say? Garbage collection is the process of removing “data chunks” that are no longer referenced, i.e. chunks that only belonged to deleted files and folders. This process deletes content to free up additional space. Data scrubbing checks integrity and validates the checksum data.
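In chunk-store terms, garbage collection is a mark-and-sweep: mark every chunk still referenced by a live file, then sweep away the rest. A simplified Python sketch of that process (illustrative only, not the actual Dedup service internals):

```python
def garbage_collect(chunk_store, file_table):
    """Remove chunks no longer referenced by any file (mark and sweep)."""
    # Mark: gather every chunk hash still referenced by a live file
    live = {h for refs in file_table.values() for h in refs}
    # Sweep: drop everything else from the store
    for h in list(chunk_store):
        if h not in live:
            del chunk_store[h]
    return chunk_store

chunk_store = {"h1": b"AAAA", "h2": b"BBBB", "h3": b"CCCC"}
file_table = {"a.txt": ["h1", "h2"]}   # the file that used h3 was deleted
garbage_collect(chunk_store, file_table)
print(sorted(chunk_store))             # ['h1', 'h2']
```

Only after the unreferenced chunks are swept does the store actually shrink, which is why the ChunkStore keeps its size until the GarbageCollection job runs.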

So why does UAC do this? UAC strips the admin credential from any un-elevated process. If you’re attempting to use an un-elevated process such as Explorer to access a remote share using only admin credentials, UAC will strip the admin credentials from the process’s security token and the process will receive an “access denied” error. Which is frustrating if you’re just changing permissions.
