Zsh Mailing List Archive

Re: [Bug] ZSH segmentation fault



    Hi Peter :)

 * Peter Stephenson <p.w.stephenson@xxxxxxxxxxxx> wrote:
> Hmm... you've made the shell use massive amounts of memory and it's
> crashed when it didn't have enough.
[...]
> The only general fix, or at least graceful get out, for crashes like
> this is for every memory access in zsh to be error checked and abort if
> it fails.  There are very, very many of these and it still doesn't help
> you run programmes requiring large amounts of memory.

    While I understand that running such scripts is very unusual, I
think that just segfaulting is not the right way of dealing with
errors. For example, in my zsh 4.2.6 I get a segfault when trying to
complete very long file names (I reported that, but it's just an
example). If I try to run a script that eats all the memory, the shell
will segfault too. In the second case I'm the culprit, because the
shell is really not intended to do that job; in the first case I'm not
guilty, but the shell behaved the same way. Moreover, even in the
first case I think it's better to abort cleanly: that way you can try
to reproduce the error in a non-login shell and get a sensible error
message when the shell gives up, instead of just a crash.
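
    Just to illustrate what I mean, here is a rough sketch of the
"check and abort" idea: a thin wrapper around malloc() that prints a
message and exits instead of letting a later NULL dereference turn
into a segfault. The name xmalloc() and the message are made up; this
is not a patch against zsh's real allocation routines:

    #include <stdio.h>
    #include <stdlib.h>

    /* Checked allocation: fail loudly instead of segfaulting later.
     * Hypothetical helper, not zsh's actual code. */
    static void *
    xmalloc(size_t len)
    {
        void *ptr = malloc(len);

        if (!ptr) {
            fprintf(stderr, "zsh: fatal error: out of memory\n");
            exit(1);
        }
        return ptr;
    }

    As Peter says, wiring something like this into every allocation
(and every access) is a lot of work, and it still won't make a
memory-hungry script run; it only turns the crash into a diagnosable
error.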

    Just my 0.02 euros, and of course I'm not criticizing anything,
since I'm not the one who has to fix every memory allocation O:)

    Raúl Núñez de Arenas Coronado

-- 
Linux Registered User 88736 | http://www.dervishd.net
It's my PC and I'll cry if I want to... RAmen!


