__/ [David Bolt] on Sunday 06 November 2005 20:23 \__
> On Sun, 6 Nov 2005, Paul F. Johnson <paul@xxxxxxxxxxxxxxxxxxxxxx>
> wrote:-
>
>>Hi,
>>
>>I'm trying to sort through about 3Gb of data files recovered from a hard
>>drive and in particular, am looking for certain phrases in a file (such
>>as GIF89a, JFIF and one from a word processor).
>>
>>How would I grep through an entire directory of files and then move the
>>ones containing the lines to another directory for examination?
>
> There are probably more elegant, shorter, and/or quicker methods, but
> here's one that should do the job:
>
> grep -a -l "JFIF" * | \
> awk -F: '{print $1}'| \
> while read i ; do cp "$i" /wherever/you/want/the/files ; done
>
> Just replace JFIF with other identifying strings, and change the
> destination, as required.
>
>
> Regards,
> David Bolt
I suspect that the above assumes all files reside in the same directory. Let
us say that the three lines above were saved in a file named move_jfif.sh.
You could then use global.sh (see below) to run it in every subdirectory.
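For reference, move_jfif.sh might then look like this (a minimal sketch;
/some/destination is a placeholder for whatever target directory you choose,
and since grep -l already prints bare file names, the awk stage can safely be
dropped):

============
move_jfif.sh
============
#!/bin/sh
# copy every file in the current directory containing "JFIF" to a holding area
grep -a -l "JFIF" * |
while IFS= read -r i ; do cp "$i" /some/destination ; done
============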
=========
global.sh
=========
#!/bin/sh
# global - execute a command in every directory below the current one
exec 3<&0                       # save standard input on descriptor 3
find . -type d -print |        # emit all directory names
while IFS= read -r dirname
do
        (cd "$dirname" || exit # quote: directory names may contain spaces
        exec 0<&3              # restore the original standard input
        "$@"                   # run the command
        )
done
exec 3<&-                       # close the spare descriptor
=========
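In case the descriptor shuffling looks mysterious: the while loop reads
directory names from the pipe, so any command run inside it that also reads
standard input would swallow the remaining names; saving the script's
original input on descriptor 3 and restoring it inside the subshell prevents
that. (move_jfif.sh does not read its own standard input, so it would work
either way, but the trick keeps global.sh general.) As a quick sanity check,
you can run a harmless command through it, for example:

sh global.sh pwd

which should print the full path of every directory below (and including) the
one you run it from.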
Then make the call:
/home/user/FULL_PATH_IS_IMPORTANT_AS_YOU_CHANGE_LOCATIONS/global.sh \
    /home/user/YOUR_PATH/move_jfif.sh
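One prerequisite: both scripts need execute permission before they can be
invoked by path, e.g.:

chmod +x /home/user/FULL_PATH_IS_IMPORTANT_AS_YOU_CHANGE_LOCATIONS/global.sh \
         /home/user/YOUR_PATH/move_jfif.sh

(Alternatively, prefix each invocation with sh, as in the sanity check above.)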
I guess this would cover more complex circumstances. By no means do I
criticise David's advice, which is excellent.
Hope it helps,
Roy
--
Roy S. Schestowitz | Data lacking semantics is currency in an island
http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E