That's excellent, thanks for the help. My coreutils is 5.97 - is this why the sort order of du -h isn't right?
– ripper234, Nov 12 '10 at 11:26

Yep, with coreutils 5.97 you'd have to settle for du /home | sort -rn, or use some 'magic' with Perl etc., as demonstrated over on ServerFault (serverfault.com/q/62411/60012)
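A minimal sketch of that fallback (the /tmp paths and file sizes are invented for the demo): without sort -h, have du emit plain kilobyte counts and sort those numerically.

```shell
# Fallback for coreutils too old for `sort -h`: make du print plain
# kilobyte counts and sort them as ordinary numbers, biggest first.
mkdir -p /tmp/du_demo/big /tmp/du_demo/small          # hypothetical demo tree
head -c 1048576 /dev/zero > /tmp/du_demo/big/file     # 1 MiB
head -c 1024    /dev/zero > /tmp/du_demo/small/file   # 1 KiB
du -k /tmp/du_demo | sort -rn                         # largest entries on top
```

The -k flag forces kilobyte units, so sort -rn orders the lines correctly even though it can't parse human-readable suffixes like "1.1M".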
– N J, Nov 12 '10 at 11:27

OTOH if there is a big sub-sub-directory, its bloat will show up multiple times (once for that dir and once for each parent dir) at the top of the results, and IMHO that distracts from the true bloat. Using "ncdu" suggested below could help with that, I'm gonna try it. =)
– lapo, Jan 13 '11 at 21:39

I find the -size option to "find" useful as well, as it lets you find all files over a certain size. At least for GNU find, you can do something like "find . -size +100M" to find files larger than 100M below the current directory.
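A quick sketch of that -size test, using a smaller +100k threshold so the demo files stay tiny (the /tmp paths and filenames are invented):

```shell
# GNU find's -size test: +100k matches files whose size exceeds 100
# kilobyte-blocks, exactly like the +100M example but at a smaller scale.
mkdir -p /tmp/find_demo                                # hypothetical demo dir
head -c 200000 /dev/zero > /tmp/find_demo/large_file   # ~200 KB, matches
head -c 10     /dev/zero > /tmp/find_demo/tiny_file    # 10 bytes, doesn't
find /tmp/find_demo -size +100k                        # lists only large_file
```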
– gabe., Feb 2 '11 at 4:48

If you want a command-line tool, I prefer ncdu, an ncurses version of du. It scans the disk (or a given folder) and then shows the top-level space usage; you can select a given directory to get the corresponding summary for that directory, and go back without needing to reanalyze.

If you're OK with a GUI program, Filelight is the closest thing to WinDirStat I've found; it shows a graphical view of space consumption.

Like ncdu, Filelight lets you select a given directory to get the breakdown for that directory.

Wow, I never realized that (I guess the "Win" in the name should've been a giveaway). A coworker once asked me if there was a Linux version of WinSCP; I died inside a little.
– Michael Mrozek♦, Nov 13 '10 at 3:27

Breaking it down: du shows disk usage; -s says print a total for each argument (each item in the current directory); -m says show sizes in megabytes, which gives sort plain numbers to work with, since older sort doesn't understand du's -h output. The -x keeps du on one filesystem; this is useful when trying to find space hogs in /var and /var/spool/foo is a different filesystem.
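The command being described can be sketched like this (the /tmp tree and file sizes are invented for the demo):

```shell
# -s: one total per argument; -m: sizes in whole megabytes (plain numbers
# that old `sort -rn` can order); -x: don't cross filesystem boundaries.
mkdir -p /tmp/hog_demo/var_a /tmp/hog_demo/var_b       # hypothetical demo tree
head -c 3145728 /dev/zero > /tmp/hog_demo/var_a/data   # 3 MiB
head -c 1048576 /dev/zero > /tmp/hog_demo/var_b/data   # 1 MiB
du -sxm /tmp/hog_demo/* | sort -rn                     # biggest subdir first
```

Each line of output is a single per-directory total, so sorting them numerically puts the space hogs at the top without listing every file.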

Yes, but du /home on my systems returns tens of thousands of files; I rarely care what the (say) 100 largest of those files are; I typically want to know which subdirectories are taking up the most space.
– P Joslin, Dec 3 '14 at 20:43
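One way to get exactly that per-subdirectory view with GNU du is --max-depth, which limits how deep du recurses before totaling; a sketch (the /tmp paths are invented for the demo):

```shell
# GNU du's --max-depth=1 prints one total per immediate subdirectory
# instead of a line for every file, so the output stays short.
mkdir -p /tmp/depth_demo/sub1 /tmp/depth_demo/sub2     # hypothetical demo tree
head -c 2097152 /dev/zero > /tmp/depth_demo/sub1/blob  # 2 MiB
head -c 1024    /dev/zero > /tmp/depth_demo/sub2/blob  # 1 KiB
du --max-depth=1 -k /tmp/depth_demo | sort -rn         # hogs at the top
```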