In this case, I've decided to find directories in two categories:
- directories older than 60 days
- directories older than 30 days but younger than 60 days.
Looking through the interwebs, I found many examples covering parts of this, but no complete example. Here is what I came up with:
find <start path> -maxdepth 1 -type d -mtime +60 | grep <directory of interest> | xargs du -bs | sort -rn
The above uses find to look for directories directly under "start path" whose age is more than 60 days (technically counting from yesterday). Note that -mtime tests "modified time"; for a directory that is the last time an entry inside it was added, removed, or renamed, which for the top of a large, since-untouched tree is loosely the time it was created. The find result is piped through grep to narrow it down to directories I know contain lots of data. That in turn is piped to du to total up how much data each one holds, and finally to sort -rn to order the list largest first.
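The pipeline above can be tried out safely in a throwaway sandbox. Everything below is a sketch: the directory names (proj_old, proj_new) are invented, and it assumes GNU find, du, and touch (the -d "90 days ago" backdating syntax is a GNU coreutils feature, as is du's -b apparent-size flag).

```shell
#!/bin/sh
# Hypothetical demo of the pipeline: build a tiny sandbox, backdate one
# directory, and run the same find | grep | du | sort chain.
demo_root=$(mktemp -d)
mkdir -p "$demo_root/proj_old" "$demo_root/proj_new"
echo "stale data" > "$demo_root/proj_old/file.dat"

# Backdate proj_old so it looks ~90 days old (GNU touch syntax).
touch -d "90 days ago" "$demo_root/proj_old"

# Same shape as the command in the post: only proj_old should survive
# the -mtime +60 test, then du/sort report its size, largest first.
find "$demo_root" -maxdepth 1 -type d -mtime +60 \
    | grep proj | xargs du -bs | sort -rn

rm -rf "$demo_root"
```

Putting -maxdepth 1 before the tests matters with GNU find, which warns if a global option follows a test like -mtime.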
To find data between 30 and 60 days old, simply replace the -mtime +60 with:
-mtime +30 -mtime -60
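A quick way to convince yourself the window works: create three backdated directories and see which one the combined test matches. The names (d15, d45, d90) are invented for illustration, and the backdating again relies on GNU touch.

```shell
#!/bin/sh
# Hypothetical check of the 30-to-60-day window.
demo_root=$(mktemp -d)
mkdir "$demo_root/d15" "$demo_root/d45" "$demo_root/d90"
touch -d "15 days ago" "$demo_root/d15"   # too new: fails -mtime +30
touch -d "45 days ago" "$demo_root/d45"   # inside the window
touch -d "90 days ago" "$demo_root/d90"   # too old: fails -mtime -60

# Only d45 should be printed.
find "$demo_root" -maxdepth 1 -mtime +30 -mtime -60 -type d

rm -rf "$demo_root"
```

Multiple tests on one find command line are ANDed together by default, which is what makes the +30/-60 pair act as a range.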
In projects that take many months, like the ones I work on, there are many people who, for reasons both legitimate and less so, keep data around long after it stops being useful. These commands, which build on other find examples I came across, help me determine who owns the oldest and largest data.
Now if only this was actually my job!