I've been resolving some problems at work lately related to full disks. One of our charters is to manage the ConfigMgr cache size on each machine, to ensure that the packages we need replicated actually get replicated out to the right machines at the right time.
But we’ve been getting some feedback about one 3rd party SCCM caching tool failing in some scenarios. Was it really the 3rd party tool failing, or some other factor?
Well, we looked at the problem and found:
- Machines with a modest 120GB SSD (most machines have a larger 250GB SSD)
- Configuration Manager Application Install packages that are around 10-15GB (yowza!)
- Users who leave too much… crap lying around on their desktops.
- And several other factors that have contributed to disks getting full.
Golly, when I try to install an application package that requires 12GB of space, and there is only 10GB free, it fails.
I wanted to get some data from the machines that are full: what is actually using up the disk space? But it's a little painful to search around a disk for directories that are larger than they should be.
One of my favorite tools is “WinDirStat”, which produces a great graphical representation of a disk, letting you visualize which directories are taking up the most space and which files are the largest. http://windirstat.net
I also like the “du.exe” tool from SysInternals. https://live.sysinternals.com/du.exe
I wrap it up in a small custom batch script file:
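@rem Run du.exe from the folder containing this batch file (%~dps0 expands to its drive and path),
@rem report only one level of subdirectories (-l 1), suppress the banner (-q), accept the EULA,
@rem and pass along any extra arguments (%*).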
@%~dps0du.exe -l 1 -q -accepteula %*
and it produces output that looks like:
PS C:\Users> dudir
    263,122  C:\Users\Administrator
      1,541  C:\Users\Default
  7,473,508  C:\Users\keith
      4,173  C:\Users\Public
  7,742,345  C:\Users
Files:        27330
Directories:  5703
Size:         7,928,161,747 bytes
Size on disk: 7,913,269,465 bytes
Cool. However, I wanted something that I could run remotely, and that would give me just the most interesting directories: say, everything over 1GB, or some other configurable threshold.
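Conceptually, here's a minimal PowerShell sketch of that idea (not the finished tool): total up each top-level folder on a remote machine and keep only the ones over a threshold. The computer name and the 1000MB floor are just placeholders.

```powershell
$ComputerName = 'PC01'   # placeholder machine name
$ThresholdMB  = 1000     # report folders at or above this size

Invoke-Command -ComputerName $ComputerName -ArgumentList $ThresholdMB -ScriptBlock {
    param($ThresholdMB)

    Get-ChildItem -Path 'C:\' -Directory -Force -ErrorAction SilentlyContinue | ForEach-Object {
        # Total the size of every file underneath this top-level folder.
        $bytes = (Get-ChildItem -Path $_.FullName -Recurse -File -Force -ErrorAction SilentlyContinue |
                  Measure-Object -Property Length -Sum).Sum

        [pscustomobject]@{
            Path   = $_.FullName
            SizeMB = [math]::Round($bytes / 1MB, 1)
        }
    } | Where-Object { $_.SizeMB -ge $ThresholdMB } | Sort-Object SizeMB -Descending
}
```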
So a tool was born.
The script enumerates all files on the local machine and returns the per-directory totals. Along the way, we can add rules to “group” interesting directories and output the results.
So, say we want to know if there are any folders under “c:\program files (x86)\Adobe\*” that are larger than 1GB. For the most part, we don’t care about Adobe Reader, since it’s under 1GB, but everything else would be interesting. Stuff like that.
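To make the grouping idea concrete, here's a minimal sketch of one rule being applied; the wildcard path and the 1000MB floor come from the example above, everything else is just illustrative, not the actual script.

```powershell
# Sketch of one grouping rule: any folder matching the wildcard that totals 1GB or more gets reported.
$RulePath  = 'C:\Program Files (x86)\Adobe\*'   # wildcard from the example above
$MinSizeMB = 1000                               # ignore anything smaller

Get-ChildItem -Path $RulePath -Directory -ErrorAction SilentlyContinue | ForEach-Object {
    $bytes = (Get-ChildItem -Path $_.FullName -Recurse -File -Force -ErrorAction SilentlyContinue |
              Measure-Object -Property Length -Sum).Sum

    if (($bytes / 1MB) -ge $MinSizeMB) {
        [pscustomobject]@{ Path = $_.FullName; SizeMB = [math]::Round($bytes / 1MB, 1) }
    }
}
```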
We have a default set of rules built into the script, but you can pass in a new set of rules using a *.csv file (I use Excel):
| Path                           |   | Size threshold (MB) |
|--------------------------------|---|---------------------|
| C:\Program Files (x86)         |   | 0                   |
| C:\Program Files (x86)\Adobe\* |   | 1000                |
| C:\Program Files (x86)\*       |   | 1000                |
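Loading the rules back out of the CSV is just an Import-Csv away. The column names, the script name, and the -RuleFile parameter below are hypothetical placeholders, not the script's actual interface.

```powershell
# Hypothetical rules.csv matching the table above; header names and the -RuleFile
# parameter are placeholders, not the script's real interface.
Import-Csv -Path .\rules.csv | Format-Table -AutoSize

# .\Get-DiskUsage.ps1 -RuleFile .\rules.csv
```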
This machine isn’t too interesting (it’s my home machine, not my work machine).
I’m still looking into tweaks and other changes to the rules to make the output more interesting.
- Should I exclude \windows\System32 directories under X size?
If you have feedback, let me know.