UNIX command to list folders with file counts


I want to get a list of folders at the current level (not including their subfolders) and simply print the folder name and a count of the number of files in the folder (preferably filtering to *.jpg if possible).

Is this possible in the standard bash shell? ls -l prints about everything but the file count :)

3/10/2009 1:56:44 AM

Accepted Answer

I've come up with this one:

find . -maxdepth 1 -type d | while read -r dir; do
    count=$(find "$dir" -maxdepth 1 -iname '*.jpg' | wc -l)
    echo "$dir ; $count"
done

Drop the second -maxdepth 1 if the search for jpg files inside each directory should recurse into sub-directories. Note that this only looks at file names: a renamed file could hide that it is a jpg picture. You can use the file command to guess from the content instead (this version also searches recursively):

find . -mindepth 1 -maxdepth 1 -type d | while read -r dir; do
    # -exec … {} + is safe for names with spaces, unlike plain xargs
    count=$(find "$dir" -type f -exec file -b --mime-type {} + |
            grep -c 'image/jpeg')
    echo "$dir ; $count"
done

However, that is much slower, since it has to read part of each file and possibly interpret its contents (if it is lucky, it finds a magic number at the start of the file). The -mindepth 1 prevents it from printing . (the current directory) as another directory that it searches.
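A quick way to see what -mindepth 1 changes, using a throwaway directory layout (the /tmp path here is just for illustration):

    # Hypothetical layout: two subdirectories under the current directory.
    mkdir -p /tmp/mindepth-demo/a /tmp/mindepth-demo/b
    cd /tmp/mindepth-demo

    find . -maxdepth 1 -type d              # prints . plus ./a and ./b
    find . -mindepth 1 -maxdepth 1 -type d  # prints only ./a and ./b
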

3/10/2009 2:23:16 AM

I found this question after I'd already figured out my own similar script. It seems to fit your conditions and is very flexible so I thought I'd add it as an answer.


  • can be grouped to any depth (0 for ., 1 for first level subdirectories, etc.)
  • prints pretty output
  • no loop, and only one find command, so it's a bit faster on large directories
  • can still be tuned to add custom filters (maxdepth to make it non-recursive, file name pattern)

Raw code:

  find -P . -type f | rev | cut -d/ -f2- | rev | \
      cut -d/ -f1-2 | cut -d/ -f2- | sort | uniq -c
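The rev | cut -d/ -f2- | rev stage is essentially a bulk dirname; on a single (made-up) path it works like this:

    # Reversed, the filename sits before the first '/', so keeping
    # fields 2- drops it; reversing again restores the directory path.
    echo 'Music/Albums/track.mp3' | rev | cut -d/ -f2- | rev
    # → Music/Albums
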

Wrapped into a function and explained:

fc() {
  # Usage: fc [depth >= 0, default 1]
  # 1. List all files, not following symlinks.
  #      (Add filters like -maxdepth 1 or -iname '*.jpg' here.)
  # 2. Cut off filenames in bulk. Reverse and chop to the
  #      first / (remove filename). Reverse back.
  # 3. Cut everything after the specified depth, so that each line
  #      contains only the relevant directory path.
  # 4. Cut off the preceding '.' unless that's all there is.
  # 5. Sort and group to unique lines with count.

  find -P . -type f \
      | rev | cut -d/ -f2- | rev \
      | cut -d/ -f1-$((${1:-1}+1)) \
      | cut -d/ -f2- \
      | sort | uniq -c
}

Produces output like this:

$ fc 0
1668 .

$ fc # depth of 1 is default
   6 .
   3 .ssh
  11 Desktop
1054 Music
 550 Pictures

Of course with the number first it can be piped to sort:

$ fc | sort
   3 .ssh
   6 .
  11 Desktop
 550 Pictures
1054 Music
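Strictly speaking, the plain sort above compares lines lexically and only happens to order the counts correctly because uniq -c right-aligns them in a fixed-width column; fc | sort -n compares the leading numbers themselves and keeps working even when a count outgrows that column. On the sample lines above (reproduced here by hand):

    # sort -n orders by the leading count regardless of column width:
    printf '%s\n' ' 1054 Music' '  550 Pictures' '    3 .ssh' | sort -n
    # →    3 .ssh
    #    550 Pictures
    #   1054 Music
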

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow