
I have a file containing a list of files in different directories and want to find the oldest one. It feels like something that should be easy with some shell scripting, but I don't know how to approach it. I'm sure it's easy in Perl and other scripting languages, but I'd really like to know whether I've missed some obvious bash solution.

Example of the contents of the source file:

/home/user2/file1  
/home/user14/tmp/file3  
/home/user9/documents/file9
SriniV
ollybee

3 Answers

#!/bin/sh

# ${oldest=$file} assigns the first file read as the initial candidate
# (and expands to it); [ a -ot b ] succeeds when a's modification time
# is older than b's, so each older file replaces the candidate.
while IFS= read -r file; do
    [ "${file}" -ot "${oldest=$file}" ] && oldest=${file}
done < filelist.txt

echo "the oldest file is '${oldest}'"
Adrian Frühwirth

You can use stat to find the last modification time of each file, looping over your source file:

oldest=
while IFS= read -r file; do
    modtime=$(stat -c %Y "$file")   # mtime in seconds since the epoch
    [[ -z $oldest || $modtime -lt $oldest ]] && oldest=$modtime && oldestf="$file"
done < sourcefile.txt
echo "Oldest file: $oldestf"

This uses the %Y format of stat, which is the last modification time in seconds since the epoch. You could also use %X for the last access time, or %Z for the last status change time.
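A quick sketch of the three format letters side by side (GNU stat and touch assumed; the timestamp is arbitrary):

```shell
# Create a scratch file, back-date its modification time, and print
# all three timestamps in epoch seconds.
f=$(mktemp)
touch -d '2015-06-01 12:00:00 UTC' "$f"          # sets mtime (and atime)
stat -c 'mtime: %Y  atime: %X  ctime: %Z' "$f"   # modification / access / status change
rm -f "$f"
```

The ctime (%Z) cannot be back-dated with touch, so here it stays at the creation time while mtime shows 2015.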

Josh Jolly

Use find to find the oldest file:

find /home/ -type f -printf '%T+ %p\n' | sort | head -1 | cut -d' ' -f2-

And with the source file:

find $(cat /path/to/source/file) -type f -printf '%T+ %p\n' | sort | head -1 | cut -d' ' -f2-
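The unquoted `$(cat …)` expansion splits on whitespace, so any listed path containing a space breaks apart. A whitespace-safe sketch (GNU xargs and stat assumed; the scratch files are invented for the demo) reads the list with `xargs -d '\n'`, which splits on newlines only, and uses stat in place of find:

```shell
# Demo list with a space in one path.
dir=$(mktemp -d)
touch -d '2001-01-01' "$dir/old file"
touch -d '2020-01-01' "$dir/new file"
printf '%s\n' "$dir/old file" "$dir/new file" > "$dir/filelist.txt"

# %Y = mtime in epoch seconds, %n = file name; numeric sort, keep the
# first line, then cut off the timestamp to leave just the path.
xargs -d '\n' -a "$dir/filelist.txt" stat -c '%Y %n' |
    sort -n | head -n 1 | cut -d' ' -f2-
```

Paths containing newlines would still break this, as with any line-oriented list.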
steveteuber