Best way to choose a random file from a directory in a shell script


Question

What is the best way to choose a random file from a directory in a shell script?

Here is my solution in Bash, but I would be very interested in a more portable (non-GNU) version for use on Unix proper.

dir='some/directory'
file=`/bin/ls -1 "$dir" | sort --random-sort | head -1` # GNU sort shuffles the listing; take the first name
path=`readlink --canonicalize "$dir/$file"` # Converts to full path
echo "The randomly-selected file is: $path"

Anybody have any other ideas?

Edit: lhunath makes a good point about parsing ls. I guess it comes down to whether you want to be portable or not. If you have the GNU findutils and coreutils then you can do:

find "$dir" -maxdepth 1 -mindepth 1 -type f -print0 \
  | sort --zero-terminated --random-sort \
  | sed 's/\d000.*//'  # strip everything after the first NUL, leaving one random name

Whew, that was fun! Also it matches my question better, since I said "random file". Honestly though, these days it's hard to imagine a Unix system deployed out there that has GNU installed but not Perl 5.
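For the record, a minimal Perl sketch along those lines (untested here, and 'some/directory' is just a placeholder); it reads the directory itself rather than parsing ls, so odd filenames aren't a problem:

dir='some/directory'
# Let Perl read the directory, keep only regular files, and print one at random.
perl -e 'opendir my $d, $ARGV[0] or die "$ARGV[0]: $!\n";
         my @f = grep { -f "$ARGV[0]/$_" } readdir $d;
         exit 1 unless @f;
         my $pick = $f[rand @f];
         print "$ARGV[0]/$pick\n";' "$dir"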


Accepted Answer

files=(/my/dir/*)                               # expand the glob into an array
printf "%s\n" "${files[RANDOM % ${#files[@]}]}" # print a random element by index

And don't parse ls. Read http://mywiki.wooledge.org/ParsingLs
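If you want to see the failure mode for yourself, here is a quick throwaway demonstration (the filename is invented purely for illustration):

dir=$(mktemp -d)
touch "$dir/$(printf 'first\nhalf')"   # one file whose name contains a newline
/bin/ls -1 "$dir" | head -1            # prints "first", a file that does not exist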

Edit: Good luck finding a non-bash solution that's reliable. Most will break for certain kinds of filenames, such as names containing spaces, newlines, or leading dashes (it's pretty much impossible in pure sh). To do it right without bash, you'd need to migrate fully to awk/perl/python/... and avoid piping their output into further text processing.
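To make that concrete, here's a sketch of the kind of find-plus-awk pipeline people reach for (the directory is a placeholder). It moves the random selection into awk, but because find's output is newline-delimited it still splits any filename that contains a newline, which is exactly the failure mode described above:

find /my/dir -type f | awk '
  BEGIN { srand() }                                  # seed from the time of day
  { names[NR] = $0 }                                 # remember each line (pathname)
  END { if (NR) print names[int(rand() * NR) + 1] }' # pick one at random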


Is "shuf" not portable?

shuf -n1 -e /path/to/files/*

or use find if the files are nested deeper than one directory:

find /path/to/files/ -type f | shuf -n1

It's part of coreutils, but you'll need 6.4 or newer to get it... so RH/CentOS does not include it.
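If your coreutils is also new enough to have shuf -z (an extra assumption beyond the 6.4 minimum above), a NUL-delimited pipeline keeps filenames with spaces or newlines intact:

# NUL separators end to end, so unusual filenames survive the pipeline.
find /path/to/files/ -type f -print0 | shuf -z -n1 | xargs -0 printf '%s\n'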


Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow