Zsh Mailing List Archive
Re: copying a million small files between disks?
- X-seq: zsh-users 12158
- From: "Anonymous bin ich" <ichbinanon@xxxxxxxxx>
- To: zsh-users@xxxxxxxxxx
- Subject: Re: copying a million small files between disks?
- Date: Thu, 1 Nov 2007 08:26:36 +0100
- In-reply-to: <d6d6637f0710311841q12f8e2bem1ca8ee166289dfe0@xxxxxxxxxxxxxx>
- Mailing-list: contact zsh-users-help@xxxxxxxxxx; run by ezmlm
- References: <6a42eec70710311440u52556985wda68ce326f4a0417@xxxxxxxxxxxxxx> <d6d6637f0710311841q12f8e2bem1ca8ee166289dfe0@xxxxxxxxxxxxxx>
I ran into the same problem and used find. I had to replace every
use of ls in my shell scripts, e.g.:

    ls *qj*par 2>|/dev/null | wc -l

with

    find . -maxdepth 1 -name '*qj*par' 2>|/dev/null | wc -l

(note the explicit "." starting point: POSIX find requires one, and
quoting the pattern keeps the shell from expanding it first).
Also, completion will be very slow, so you might want to disable it
if pressing TAB is a habit of yours :)
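For interactive one-offs there is also a zsh-native way to get the
same count without forking ls. A sketch, assuming zsh's N (NULL_GLOB)
and oN ("don't sort") glob qualifiers:

    # Expand the pattern into an array: N makes a non-match expand
    # to nothing instead of raising an error, oN leaves the names in
    # directory order (i.e. skips the sort); then count the result.
    files=( *qj*par(NoN) )
    print ${#files}

The glob still has to hold every matching name in memory, but at
least the sort is skipped.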
On 11/1/07, Christopher Browne <cbbrowne@xxxxxxxxx> wrote:
> On 10/31/07, sam reckoner <sam.reckoner@xxxxxxxxx> wrote:
> > I'm not exaggerating. I have over one million small files that I'd like to
> > move between disks. The problem is that even getting a directory
> > listing takes forever.
> >
> > Is there a best practice for this?
> >
> > I don't really need the directory listing, I just need to move all the
> > files. I have been using rsync, but that takes a very long time to
> > generate the list of files to be moved.
> >
> > Any alternatives?
>
> Yeah, I'd use find.
>
> The fundamental problem with ls, which you're clearly running into, is
> that when there are a million files:
>
> a) the directory entries all have to be read, and
>
> b) they all have to be held in memory (in some form of array), and
>
> c) they then get sorted (presumably generating a *second* array,
> though possibly not).
>
> You're getting your lunch eaten by b) and c).
>
> You might try:
> "find /path/where/all/the/files/are | xargs cp -I {}
> /path/that/is/destination"
>
> That will skip steps b and c.
> --
> http://linuxfinances.info/info/linuxdistributions.html
> "... memory leaks are quite acceptable in many applications ..."
> (Bjarne Stroustrup, The Design and Evolution of C++, page 220)
>
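For what it's worth, the quoted find | xargs pipeline forks one cp
per file when xargs is given -I. A batched sketch (assuming GNU find,
GNU xargs, and coreutils cp with its -t option) copies many files per
fork and still skips the sort:

    # -print0/-0 keep names with spaces or newlines intact; cp -t
    # names the destination up front, so xargs can append as many
    # source files as fit on each command line.
    find /path/where/all/the/files/are -type f -print0 |
        xargs -0 cp -t /path/that/is/destination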
--
Regards,