
script to automate define limit of a file

This topic is closed.

    Dear friends,

    Is it possible to make a file with a specified number of lines? I mean, I want to create a log file which should not exceed 1000 lines. How can I define the limit?

    Like, say:
    " my_script_generating_log.sh >> log.txt "

    And second:

    Is it possible that when line 1001 needs to be recorded to the log.txt file, the first line gets deleted automatically, and so on? If line 1002 comes, the 2nd should be deleted, meaning the total number of lines stays at 1000: whenever a new line comes into log.txt, the oldest line gets deleted automatically. Or any other similar idea which can achieve this. The target is that I want only the latest 1000 logs generated by my script. Basically it is the urlsnarf package, which records the browsing history of users on eth0. I want only the latest 1000 logs.

    urlsnarf -i eth0 >> log.txt


    Third:
    please give me any idea of how I can view/cat a file which was created by the mkfifo command.
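    For what it's worth, a file created by mkfifo is a named pipe: reading it blocks until some other process writes into it, so a writer has to be started first (here, in the background). A minimal sketch, with a throwaway placeholder path:

```shell
#!/bin/bash
# A mkfifo file is a named pipe; cat blocks until a writer shows up.
PIPE=$(mktemp -u)          # unused temp name for the pipe (placeholder)
mkfifo "$PIPE"
echo "hello from the pipe" > "$PIPE" &   # background writer
out=$(cat "$PIPE")                       # reader; unblocks the writer
wait                                     # reap the background writer
echo "$out"
rm -f "$PIPE"
```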


    thanks

    #2
    You might want to look at
    Code:
    man logrotate.conf
    Otherwise something like:

    Code:
    #!/bin/bash
    
    LOGFILE=somefile.log
    LOGLIMIT=1000
    
    # Append one line, then drop lines from the top if the file
    # has grown past LOGLIMIT lines.
    function log() {
        echo "$1" >> "$LOGFILE"
        log_overflow=$(( $(wc -l < "$LOGFILE") - LOGLIMIT + 1 ))
        if (( log_overflow > 0 )) ; then
            newlog=$(tail -n "+$log_overflow" "$LOGFILE")
            echo "$newlog" > "$LOGFILE"
        fi
    }
    
    log "$(date +%M:%S)"
    Last edited by james147; Feb 19, 2013, 05:48 AM.
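    A self-contained demonstration of the same trimming idea, with a small limit so the effect is visible (the limit of 3 and the mktemp file are placeholders for demonstration; use 1000 and a real path in practice):

```shell
#!/bin/bash
# Demo: keep only the newest LOGLIMIT lines of LOGFILE.
LOGFILE=$(mktemp)   # placeholder log file
LOGLIMIT=3          # small limit so trimming is visible

log() {
    printf '%s\n' "$1" >> "$LOGFILE"
    # Over the limit? Keep only the newest LOGLIMIT lines.
    if (( $(wc -l < "$LOGFILE") > LOGLIMIT )); then
        trimmed=$(tail -n "$LOGLIMIT" "$LOGFILE")
        printf '%s\n' "$trimmed" > "$LOGFILE"
    fi
}

for i in 1 2 3 4 5; do
    log "line $i"
done

cat "$LOGFILE"   # prints: line 3, line 4, line 5
```

    To feed it from the original poster's command, the same loop body works with `urlsnarf -i eth0 | while IFS= read -r line; do log "$line"; done`.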



      #3
      Originally posted by farjibaba View Post
      is it possible that if 1001 line need to be recorded to log.txt file then automatically first line get deleted and so on. if 1002 line come 2nd should be delete means total no of lines should be 1000 and whenever new line comes to the log.txt oldest line get deleted automatically.
      You can't do that easily with ordinary *nix text files; it implies rewriting the file on every line. (In the days when operating systems proliferated like languages do now, some OSes had file types that could do it, I'm sure.)
      Close is the log rotation that james147 has suggested: say, 10 files of 100 lines each, plus the one currently being written. So sometimes you've got 1000 lines of log, sometimes 1099.
      If you really want exactly 1000 lines, you could do it with a database, say MySQL, with a trigger that deletes the oldest row when a new one is inserted.
      But if you were really serious about this stuff, you'd pick your favourite language, then use that language's source of components to get a logging framework. I really like log4perl and log4j, so I suppose for bash log4sh would be good. A lot to learn, though, but it will go as far as you might ever need.
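      The rotation approach could be sketched as a logrotate configuration along these lines (a sketch only; the file path, counts, and chosen options are assumptions, see man logrotate.conf for the full list):

```
# Hypothetical /etc/logrotate.d/urlsnarf
/var/log/urlsnarf.log {
    daily           # rotate once per day (when the daily cron job runs)
    rotate 10       # keep 10 rotated files; older ones are deleted
    missingok       # don't complain if the log is absent
    notifempty      # skip rotation when the log is empty
    copytruncate    # truncate in place so the writer keeps its file handle
}
```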

      Regards, John Little



        #4
        Originally posted by james147 View Post
        You might want to look at
        Code:
        man logrotate.conf
        Thanks for this, it looks fine for me; I studied it via Google.
        Please tell me: if I use "rotate 1" and "daily", what does that mean? And when will the system be done rotating, at what time? I am confused about it.
        Also, I am using "olddir /mydir/" and I made the entries in /etc/logrotate.conf.
        It backs up to my directory when I run logrotate -f /etc/logrotate.conf,
        but only up to 10 files. Can it be extended to 15? If yes, then how?
        Please.
