I’m a searcher. I always grep for things. I use fun commands like:
vim -p `grep -lR xyz .`
Which finds all files containing xyz and automatically opens them as tabs in vim (it doesn’t work too well when a ton of files are found, though 😉). At times, however, grep is just… a bit slow. A client of mine has a site with a considerable amount of data in the site’s directory. I needed to find any file containing “16777216” (that’s 16MB in bytes), because he’s increasing his upload limit to 1GB. I tried a grep on the root of the site:
grep -lR "16777216" .
and after about 5 minutes, I gave up and went searching for a faster method. I found a post on Stack Overflow showing that simply by piping find’s output into xargs and fanning the work out across parallel grep processes, the search gets considerably faster:
time find ./ -name "*" -print0 | xargs -0 -n1 -P8 grep -H "16777216" >> 16mb.find

real    0m0.409s
user    0m0.517s
sys     0m1.470s
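If you know the extension you’re after, you can narrow the find pattern too. Here’s a sketch of that idea run against a throwaway directory (the *.php pattern, file names, and matches.txt are my own assumptions for the demo, not from the original command):

```shell
# Build a tiny throwaway tree so the pipeline has something to chew on.
dir=$(mktemp -d)
echo "upload_max_filesize = 16777216" > "$dir/settings.php"
echo "nothing relevant here"          > "$dir/notes.txt"

# Same shape as the command above, but only PHP files are fed to xargs,
# which fans out up to 8 grep processes, one file each (-n1).
find "$dir" -name "*.php" -print0 \
  | xargs -0 -n1 -P8 grep -H "16777216" > "$dir/matches.txt"

# matches.txt now lists settings.php with the matching line.
cat "$dir/matches.txt"
```

The -print0 / -0 pair keeps file names with spaces from being split apart on their way between find and xargs.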
So there you have it: giving up after 5 minutes versus a full list in about a second. WIN 🙂
By the way, in the command above you’ll see -P8 after xargs. That flag tells xargs how many processes to run in parallel. From the man page:
--max-procs=max-procs, -P max-procs
    Run up to max-procs processes at a time; the default is 1. If
    max-procs is 0, xargs will run as many processes as possible at a
    time. Use the -n option with -P; otherwise chances are that only
    one exec will be done.
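You can see the -n/-P interplay for yourself without touching any files (the numbers here are arbitrary, just for illustration):

```shell
# Four arguments, handed to echo two at a time (-n2), with up to two
# echo processes running concurrently (-P2). Without -n, xargs would
# pack all four arguments into a single exec and -P2 would do nothing.
printf '%s\n' a b c d | xargs -n2 -P2 echo
```

This prints two lines, "a b" and "c d" (their order isn’t guaranteed, since the two processes race).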