Here's a common problem - I'm working along running tests and bam, the disk is full, and every running test fails when the simulator can no longer write to disk. I won't get into the cost-benefit analysis of simply adding more disks to the project - that's not my thing. I'm going to talk about how to find old directories so that the owners of those directories can be prompted to delete stuff they don't need.
In this case, I've decided to find directories in two categories:
- directories older than 60 days
- directories older than 30 days but younger than 60 days.
Looking through the interwebs I found that there are many examples of how to do some of this but no full example. Here is what I came up with:
find <start path> -maxdepth 1 -type d -mtime +60 \
    | grep <directory of interest> \
    | xargs du -bs \
    | sort -rn
The above uses find to look for directories directly under "start path" (that's the -maxdepth 1) that were last modified more than 60 days ago. Note that find counts age in whole 24-hour periods and discards any fraction, so +60 effectively means 61 days or more. Also note that mtime is "modified time", which for a directory updates only when entries directly inside it are created, removed, or renamed; for a directory at the top of a large tree, where most of the churn happens further down, it usually stays close to the creation time. The find result is piped through grep to narrow it down to directories I know contain lots of data. That in turn is piped to du to total up how much data is there, and finally to sort to order the list from largest to smallest.
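One caveat: a plain pipe into xargs breaks if any directory name contains spaces or other shell-unfriendly characters. Assuming GNU versions of find, grep, and xargs (the -print0, -z, and -0 flags are GNU extensions), a NUL-delimited sketch of the same pipeline handles those names safely:

find <start path> -maxdepth 1 -type d -mtime +60 -print0 \
    | grep -z <directory of interest> \
    | xargs -0 du -bs \
    | sort -rn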
To find data between 30 and 60 days old, simply replace the -mtime +60 with:
-mtime +30 -mtime -60
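find ANDs its tests together by default, so that matches directories more than 30 days old and less than 60 days old at the same time. Put together, the middle band is just the same pipeline with the swapped test:

find <start path> -maxdepth 1 -type d -mtime +30 -mtime -60 \
    | grep <directory of interest> \
    | xargs du -bs \
    | sort -rn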
In projects that run for many months, like the ones I work on, there are plenty of people who, for reasons both legitimate and less so, keep data around long after it stops being useful. These commands, which build on other find examples I came across, help me determine who owns the oldest and largest data.
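Since the whole point is to prompt the right owners, it can also help to print the owner next to each stale directory rather than chasing it down afterwards. Here's a minimal sketch assuming GNU find, whose -printf directives %u (owner), %T (mtime), and %p (path) are GNU extensions:

find <start path> -maxdepth 1 -type d -mtime +60 -printf '%u\t%TY-%Tm-%Td\t%p\n' \
    | sort

That prints owner, last-modified date, and path, one line per stale directory, sorted so each owner's directories group together.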
Now if only this was actually my job!