rsync / find / perl - Super-fast deletion of a folder with a large number of files


http://www.dxulab.com/wiki/superfastdeleteafolderwithlargenumberoffiles


posted Jul 4, 2016, 12:08 AM by Dong Xu   [ updated Jul 6, 2016, 9:20 AM]

http://www.slashroot.in/which-is-the-fastest-method-to-delete-files-in-linux

time perl -e 'for(<*>){((stat)[9]<(unlink))}'

time find ./ -type f -delete (the BEST)

time rsync -a --delete blanktest/ test/
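As a quick sanity check, the `find -delete` method can be exercised on a small throwaway directory (the paths below are illustrative, not from the benchmark):

```shell
# Quick sanity check: make a throwaway directory with 100 files,
# then remove them with find -delete
dir=$(mktemp -d)
touch "$dir"/file{1..100}.txt
find "$dir" -type f -delete
ls "$dir" | wc -l    # prints 0
```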

TIME TAKEN

Method                 Time taken
------                 ----------
rm                     Not capable of deleting a large number of files
find with -exec        14 minutes for half a million files
find with -delete      5 minutes for half a million files
perl                   1 minute for half a million files
rsync with --delete    2 minutes 56 seconds for half a million files
===============================================

Nice article. It inspired me to check the results for find -delete, rsync and perl myself, and I got a different ranking: on my PC the leader is find. Setup: Linux 4.2, Ubuntu 14.04, Intel i5 (4 cores), Intel SSD 5xx series, EncFS encryption.

$ time for i in $(seq 1 500000); do echo testing >> $i.txt; done

real 1m13.263s
user 0m7.756s
sys 0m57.268s

Operation was repeated for each test with similar results.

$ time rsync --delete -av ../empty/ ./

real 4m5.197s
user 0m4.308s
sys 1m43.400s

$ time find ./ -delete

real 2m19.819s
user 0m1.044s
sys 0m59.100s

$ time perl -e 'unlink for ( <*> ) '

real 3m17.482s
user 0m2.524s
sys 1m29.196s

You can also use this; you need glob so the pattern is expanded when removing files:

unlink glob "'/tmp/*.*'";

The extra apostrophes are needed so that filenames containing spaces are handled as single strings.

Won't delete files with no "." in them. Won't delete files with a leading ".". No error reporting.

=========================
mkdir empty_dir
rsync -a --delete -P empty_dir/ your_folder/

Note: the trailing "/" on both paths is needed!
===========================
Check a folder and list its sub-folders by size:

du -sh * | sort -h
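A common variant (a sketch, assuming GNU sort with -h support) lists the largest entries first and trims the output to the top ten:

```shell
# Demo: two subdirectories of different sizes, listed largest-first;
# sort -rh reverses the human-readable sort, head keeps the top 10
cd "$(mktemp -d)"
mkdir big small
dd if=/dev/zero of=big/f bs=1024 count=200 2>/dev/null
dd if=/dev/zero of=small/f bs=1024 count=1 2>/dev/null
du -sh * | sort -rh | head -n 10
```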
============================


posted @ 2017-03-04 09:14 by 张同光