Zsh Mailing List Archive
Re: backing up with zsh
- X-seq: zsh-users 1469
- From: TGAPE! <tgape@xxxxxxxxxxxxx>
- To: zsh-users@xxxxxxxxxxxxxxx
- Subject: Re: backing up with zsh
- Date: Sat, 18 Apr 1998 09:36:39 +0000 (GMT)
- In-reply-to: <199804172020.PAA27672@xxxxxxxxxxxxxxxxx> from "Rob Windsor" at Apr 17, 98 03:20:29 pm
Rob Windsor wrote:
> Verily did Sven Guckes write:
>
>> I am trying to use zsh to back up my webpages.
>> I just want to back up text files, i.e. no pictures.
>
>> The webpages all reside in "~/.P". The pictures are within ~/.P/pics/ -
>> excluding this one dir is not a problem, but the problem is that some of
>> the pictures are still scattered across the subdirectories of ~/.P.
>> How do I exclude these pictures?
>
>> Possible filename extensions for pictures are: bmp gif jpg jpeg
>> But how do I specify these as case insensitive?
>> Do I have to list all possible combinations of upper- and lowercase spelling?
<snip&splice>
>> I may also need a restriction for file size (for all files), say 100K at
>> most.
>
> possibly.
>
> A good start would be:
>
> cd ~/.P ; find . -type f \! \( -name '*.gif' -o -name '*.jpg' -o -name '*.jpeg' -o -name '*.bmp' \) -print | xargs tar cvvzf /tmp/guckes.web.980417.tar.gz
A better start would be:
gtar czf /tmp/guckes.web.980417.tgz `cd ~/.P; find *(D) \! -type d \! -size +100k \
    \! \( -iname '*.gif' -o -iname '*.jpg' -o -iname '*.jpeg' -o \
    -iname '*.bmp' \) -print`
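(For what it's worth, -iname is not in old POSIX find, though both GNU and BSD find have it. A quick sketch of the case-insensitive exclusion, with made-up filenames:)

```shell
# -iname matches case-insensitively (GNU/BSD find extension).
# Throwaway demo files; the names are made up:
cd "$(mktemp -d)"
touch photo.GIF logo.JpG notes.txt
find . -type f \! \( -iname '*.gif' -o -iname '*.jpg' \) -print
# prints only ./notes.txt
```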
Also, that find command is longer (though more efficient, I hate to
admit) than the zsh version, which needs setopt extendedglob for the ~
exclusion:
gtar czf /tmp/guckes.web.19980418.tgz \
    **/*~*.([Gg][Ii][Ff]|[bB][mM][pP]|[jJ][pP][gG]|[jJ][pP][eE][gG])(DL-102401^/)
which is already fairly insane. (Of course, I use find jobs more
complicated than that every few months or so. However, I'm trying to
get some stars for my blue pointy hat.)
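In case the magic number looks arbitrary: the L-n glob qualifier means
"size strictly less than n bytes", so to admit files of at most 100K you
want n = 100*1024 + 1:

```shell
# L-n in a zsh glob qualifier selects files strictly smaller than n bytes,
# so n = 100*1024 + 1 admits everything up to and including 100K.
echo $((100 * 1024 + 1))   # prints 102401
```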
Hmm...
cd ~/.P; for x in **/*(DL-102401^/)
do
    if [ $(file "$x" | grep -c picture) -eq 0 ]; then
        tar cf - "$x"
    fi
done | gzip -9 > /tmp/guckes.web.`date +%Y%m%d`.tgz
would be much more general; it's more keystrokes, though, and it'll run
slower than a Siamese cat pulling a 1-ton stone block up a pyramid.
(Well, OK, maybe not *quite* that slow, but pretty damn slow.)
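One caveat with that loop: each tar cf - "$x" writes its own
end-of-archive padding, so the gzipped result is several tar streams
back to back, and a plain tar xf stops after the first. GNU tar's -i
(--ignore-zeros) reads through them all; a sketch, assuming gtar and
made-up file names:

```shell
# Two single-file tar streams, concatenated and gzipped, as the loop does.
# Plain extraction stops after the first archive; -i reads them all.
# (GNU tar assumed; file names are made up.)
cd "$(mktemp -d)"
echo one > a
echo two > b
{ tar cf - a; tar cf - b; } | gzip -9 > both.tgz
mkdir out && cd out
gzip -dc ../both.tgz | tar -x -i -f -    # extracts both a and b
```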
zsh is fairly resistant to the 'line too long' problem; it takes a
fairly large expansion to trigger it, and some cases appear to be
completely licked.
If you really need multiple v options to gtar, I'm not the guy you need
to talk to; I don't have a couch.
Uh,
cd ~/.P; for x in **/*(D)
do
    if [[ $x = ${x#pics/} && $(file "$x" | grep -c picture) -gt 0 ]]; then
        n=pics/${x##*/}
        ln "$x" "$n"
        for y in **/*(D.)
        do
            sed "s@$x@$n@g" "$y" > /tmp/nopics.$$
            mv /tmp/nopics.$$ "$y"
        done
        rm "$x"
    fi
done
should go a long way toward moving all your pictures and fixing your
links. Other people's links, however, are another matter. I would
suggest backing up your webpage (probably including pictures this time)
before hazarding it. (If you're curious about the ln followed by rm: I
work in a 24x7x365 environment, where even one-second outages can be
costly.)
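(The point of ln-then-rm, sketched with made-up names: a hard link is a
second name for the same inode, so the data stays reachable through the
new name at every instant; there is no window where the file is
missing.)

```shell
# Hard link first, then remove the old name; the content is never
# unavailable in between. Names here are made up for the demo.
cd "$(mktemp -d)"
mkdir pics
echo 'GIF89a...' > banner.gif       # stand-in for a real picture
ln banner.gif pics/banner.gif       # two names, one inode
rm banner.gif                       # drop the old name; data survives
cat pics/banner.gif                 # still prints GIF89a...
```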
Ed