
Q: What is the best (fastest) method of finding the large files that push a filesystem (e.g. /usr) to 100% usage? I check for core dumps and log files immediately, but there has to be a better way to find huge files quickly.

Thanks for any assistance.

#!/bin/sh
# List files under directory $1 that are larger than $2 kB.
# find's -size unit is 512-byte blocks, so kB * 2 converts kB to blocks.
# ls -ld is the portable equivalent of HP-UX's ll -d.
if [ $# -eq 2 ]
then
    find "$1" -xdev -type f -size +"$(expr "$2" \* 2)" -exec ls -ld {} \;
else
    echo "Usage: ${0##*/} <directory> <size_in_kB>"
fi

--snip--

Run it this way:

./bigfiles.sh /var 5000
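If you don't want a per-file `find -exec` (which forks once per match), a `du` pipeline answers the same question on any POSIX system and sorts the results by size. This is a minimal sketch, not from the original post; the `/var` default and the top-10 cutoff are arbitrary illustration choices:

```shell
#!/bin/sh
# Sketch: show the 10 largest entries under a directory, sizes in kB.
# du -a lists every file (not just directories), -k reports kB,
# sort -rn orders largest first, head keeps the top of the list.
dir=${1:-/var}
du -ak "$dir" 2>/dev/null | sort -rn | head -10
```

The directory total itself appears as the first line of `du -a` output, so the biggest individual files show up immediately after it.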
