
recommendations for which shell to use - SCO


  1. #1

    Re: recommendations for which shell to use

    On Tue, Nov 18, 2003, Mike Brown wrote: 

    This is probably the OS and/or file system, not the shell.

    Bill
    --
    INTERNET: COM Bill Campbell; Celestial Software LLC
    UUCP: camco!bill PO Box 820; 6641 E. Mercer Way
    FAX: (206) 232-9186 Mercer Island, WA 98040-0820; (206) 236-1676
    URL: http://www.celestial.com/

    Basic Definitions of Science:
    If it's green or wiggles, it's biology.
    If it stinks, it's chemistry.
    If it doesn't work, it's physics.
    Bill Guest

  2. #2

    Re: recommendations for which shell to use

    Bill Campbell wrote: 
    >
    > This is probably the OS and/or file system, not the shell.
    >
    > Bill
    > --
    > INTERNET: COM Bill Campbell; Celestial Software LLC
    > UUCP: camco!bill PO Box 820; 6641 E. Mercer Way
    > FAX: (206) 232-9186 Mercer Island, WA 98040-0820; (206) 236-1676
    > URL: http://www.celestial.com/
    >
    > Basic Definitions of Science:
    > If it's green or wiggles, it's biology.
    > If it stinks, it's chemistry.
    > If it doesn't work, it's physics.

    And to continue ... If it doesn't work till the next version, it's computer science.

    The OS can handle files > 2GB, but apparently shell redirection
    cannot; i.e., cp file1 file2 is okay, but cat file1 >> file2 is not.

    Some of the open source shells compile on UW, I was wondering if
    anyone had suggestions based on performance etc.

    Mike
    --
    Michael Brown

    The Kingsway Group
    Mike Guest

  3. #3

    Re: recommendations for which shell to use

    Bill Campbell <com> wrote in message news:<mi.celestial.com>... 
    >
    > This is probably the OS and/or file system, not the shell.

    Yes and no. In UnixWare 7 the "cp" and "cat" commands are
    explicitly documented as supporting >2GB files. The "tail"
    command is not.

    The shell gets into the picture because of file redirection,
    which is under the shell's control.
    If you want to use cat to concatenate two large files, you'd
    like to be able to say

    $ cat largefile1 largefile2 >largefile3

    or append to an existing large file

    $ cat ... >>existinglargefile

    but as the redirection is performed by the shell doing its own
    open() calls, those opens are not large-file capable unless
    the shell is built that way. And as far as I know, the
    UW7 shells are not built as large file aware.

    I am told this matches the expressed intent of the LFS specification.
    Its view was that large files are "precious" and could too easily
    be accidentally trashed if the shell were built as large file capable.
    LFS also had the view that >2GB files would only be binary databases,
    not normal text files, and thus there would be little need for many
    of the text-oriented UNIX commands to work upon them. (For what it's
    worth, SCO disagreed with these LFS decisions at the time.)

    Given all this, you have to be clever to come up with ways of doing
    the usual file operations with large files. You can use dd, for example,
    as the target of the cat commands above:

    $ cat largefile1 largefile2 | dd of=largefile3
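A small-scale sketch of the same dd trick (file names are examples and the sizes are shrunk to a few bytes; `of=` is the standard dd spelling for the output operand):

```shell
# Tiny stand-ins for the large files (names are examples only)
printf 'part one\n' > largefile1
printf 'part two\n' > largefile2

# Let dd, not the shell, open the output file; on UW7 a large-file
# aware dd can then write past 2GB even though the shell could not.
cat largefile1 largefile2 | dd of=largefile3 2>/dev/null

cat largefile3
```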

    Jonathan Schilling
    J. Guest

  4. #4

    Re: recommendations for which shell to use

    In article <mi.celestial.com>,
    Bill Campbell <com> wrote:
    > This is probably the OS and/or file system, not the shell.

    Or the program itself. In Olden Dayze the tail on SCO worked only
    on the last 100 lines of the file.
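Following the dd suggestion elsewhere in this thread, one workaround for a limited tail is to let dd skip to the end of the file itself. A toy sketch (the file and byte counts are examples; on UW7 the point is that a large-file aware dd can seek past 2GB where tail cannot):

```shell
# Toy file standing in for one too big for the stock tail
printf 'line1\nline2\nline3\n' > bigfile

# Grab the last 12 bytes by skipping everything before them
size=$(wc -c < bigfile)
dd if=bigfile bs=1 skip=$((size - 12)) 2>/dev/null
```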

    --
    Bill Vermillion - bv wjv . com
    Bill Guest

  5. #5

    Re: recommendations for which shell to use

    On Tue, Nov 18, 2003, Bill Vermillion wrote:
    >Or the program itself. In Olden Dayze the tail on SCO worked only
    >on the last 100 lines of the file.

    Given that the first thing I've always done on any *ix system we use is to
    install a ton of GNU tools, built with --program-prefix=g, I've rarely run
    into these problems. I use the ``g'' prefix so that my fingers
    automatically type ``gfind, gls, etc.'' to be sure that I get the command I
    expect (I even make a bunch of symlinks to these names on Linux systems so
    they work there too).
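The usual recipe for that kind of prefixed GNU install looks roughly like this (a sketch using the standard autoconf flag; the package and paths are examples):

```shell
# Build GNU findutils (for example) so the binaries install as gfind etc.
./configure --program-prefix=g --prefix=/usr/local
make
make install    # installs gfind, gxargs, ...

# On systems where the GNU tools are already the default, symlinks
# keep the same finger-macros working:
ln -s /usr/bin/find /usr/local/bin/gfind
```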

    Answering the original question, my preference for an interactive shell is
    the Korn shell -- usually pdksh, because I'm used to it, even though I know
    the real ksh is now available as open source.

    Bill
    --
    INTERNET: COM Bill Campbell; Celestial Software LLC
    UUCP: camco!bill PO Box 820; 6641 E. Mercer Way
    FAX: (206) 232-9186 Mercer Island, WA 98040-0820; (206) 236-1676
    URL: http://www.celestial.com/

    Breathe fire, slay dragons, and take chances. Failure is temporary, regret
    is eternal.
    Bill Guest

  6. #6

    Re: recommendations for which shell to use

    com (J. L. Schilling) wrote in message news:<google.com>... 
    > but as the redirection is performed by the shell doing its own
    > open() calls, those opens are not large-file capable unless
    > the shell is built that way. And as far as I know, the
    > UW7 shells are not built as large file aware.
    > [...]

    Ridiculous.
    Does that mean I can't
    tar cf - /6-gig-tree | bzip2 >1.8-gig-file.tar.bz2
    even though no single file along the way is over 2 gigs? Will the
    pipe fail because the shell gives up during redirecting?

    Actually, I already know that particular example works on OpenServer
    in ksh.
    My guess is that tar is special in some way that allows
    it to pump out data indefinitely, so that it can tar to tape drives?
    (And bzip2's output is less than 2G.)
    Brian Guest

  7. #7

    Re: recommendations for which shell to use

    In article <mi.celestial.com>,
    Bill Campbell <com> wrote:

    Ah - that does make a difference.
    I tried pdksh - had some problems that I can't remember. I've been
    using only the REAL ksh for years.


    --
    Bill Vermillion - bv wjv . com
    Bill Guest

  8. #8

    Re: recommendations for which shell to use

    com (Brian K. White) wrote in message news:<google.com>... 
    > Ridiculous.
    > Does that mean I can't
    > tar cf - /6-gig-tree |bzip2 >1.8-gig-file.tar.bz2
    > even though no single file along the way is over 2 gigs, the pipe will
    > fail because the shell will give up during redirecting?

    No.

    Right.
    Here's how it was explained to me:

    There are no restrictions on any *pipeline* regarding how much
    data passes through it during its lifetime. There are no
    restrictions on any utility's use of data through pipelines.
    The only restrictions are on access to regular (but large)
    files.

    Given the above example, there are no >2G issues, as no single
    large file is accessed (presumably) in the "/6-gig-tree" and
    the resulting compressed tar file is also not too big.

    However, if the bzip2 compression were not enough to reduce the
    produced tar file down to 2G or less, the bzip2 utility (even if
    it were built as large file capable!) would fail when attempting
    to write beyond the 2G boundary, as it's the shell's open of
    "1.8-gig-file.tar.bz2" that determines the large file restriction.

    Moreover, if the above were written (for no particularly good
    reason) as "tar -cf - /6-gig-tree >6+gig.tar; bzip2 6+gig.tar",
    the tar invocation would be guaranteed to fail.
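A shrunken illustration of that rule (gzip stands in for bzip2 here, and the sizes are tiny; on UW7 the boundary in question is 2GB):

```shell
# Any amount of data may flow through the pipe; only the final
# regular-file open, done by the shell, is subject to the size limit.
mkdir -p tree
printf 'hello\n' > tree/file1

tar -cf - tree | gzip > tree.tar.gz    # shell opens only tree.tar.gz

# The archive also round-trips entirely through pipes
gzip -dc tree.tar.gz | tar -tf -
```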

    Jonathan Schilling
    J. Guest

  9. #9

    Re: recommendations for which shell to use

    "J. L. Schilling" wrote: 
    > >
    > > Ridiculous.
    > > Does that mean I can't
    > > tar cf - /6-gig-tree |bzip2 >1.8-gig-file.tar.bz2
    > > even though no single file along the way is over 2 gigs, the pipe will
    > > fail because because the shell will give up during redirecting?[/ref]
    >
    > No.

    >
    > Right.

    >
    > Here's how it was explained to me:
    >
    > There's no restrictions on any *pipeline* regarding how much
    > data passes through it during its lifetime. There's no
    > restrictions on any utilities use of data through pipelines.
    > The only restrictions are on access to regular (but large)
    > files.
    >
    > Given the above example, there's no >2G issues as no single
    > large file is accessed (presumably) in the "/6-gig-tree" and
    > the resulting compressed tar file is also not too big.
    >
    > However, if the bzip2 compression where not enough to reduce the
    > produced tar file down to 2G or less, the bzip2 utility (even if
    > it was build as large file capable!) would fail when attempting
    > to write beyond the 2G boundary, as it's the shell's open of
    > "1.8-gig-file.tar.bz2" that determines the large file restriction.
    >
    > Moreover, if the above were written (for no particularly good
    > reason) as "tar -cf - /6-gig-tree >6+gig.tar; bzip2 6+gig.tar",
    > the tar invocation would be guaranteed to fail.
    >
    > Jonathan Schilling[/ref]

    Thanks for the suggestions; I will try using dd to manipulate the
    files. What is happening is that a Progress app is dumping CSV
    information out, but does it in < 2GB files. The program that reads
    the information wants it in one big file > 2GB. Progress also adds
    one status line at the end of each intermediate file, which needs
    to be stripped.
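A sketch of one way to put those pieces together (the file names are hypothetical; sed '$d' drops the trailing status line from each part, and dd rather than the shell holds the open on the big output file):

```shell
# Tiny stand-ins for the <2GB Progress dump parts (names hypothetical)
printf 'a,1\nb,2\nSTATUS: dump ok\n' > dump_part1.csv
printf 'c,3\nd,4\nSTATUS: dump ok\n' > dump_part2.csv

for part in dump_part1.csv dump_part2.csv; do
  sed '$d' "$part"          # everything but the last (status) line
done | dd of=combined.csv 2>/dev/null   # dd does the large-file open

cat combined.csv
```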

    Mike

    --
    Michael Brown

    The Kingsway Group
    Mike Guest


