1

I have a process that outputs a line for every progress update (side note: it clears/replaces the line in place rather than printing a plain newline).

I want to save the latest line of that process to an output file or truncate the output file to keep the size manageable.

At the moment I have genrtr > genrtr.log and with a cron I tried to use > genrtr.log but it doesn't work. Also rm genrtr.log doesn't help because then the process stops updating the file.

I understand why those don't work, but wonder how to restructure it so it fits my needs.

Tried genrtr | sed -ne '$w genrtr.log' but then it waits for the process to end before writing to the file.

Clarifications: The process produces output every 1 second and, unless the server crashes, will keep running forever.

Sev
  • 140

4 Answers

0

Solution

At the moment I have genrtr > genrtr.log and with a cron I tried to use > genrtr.log but it doesn't work.

This approach can easily be fixed by using genrtr >> … instead of genrtr > … (while still using > in cron to truncate the file).

The difference is explained in another answer of mine, which can be summarized by the following statement:

>> is essentially "always seek to end of file" while > maintains a pointer to the last written location.

Read the linked answer; it matches your case exactly.
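A minimal demonstration of the difference (hypothetical filenames, run in a scratch directory). A writer opened with > keeps its own file offset, so an outside truncation leaves a hole; a writer opened with >> (O_APPEND) seeks to end-of-file on every write, so truncation works as intended:

```shell
log=$(mktemp)

# Writer opened with '>': the fd remembers its offset.
( exec > "$log"
  echo "line one"          # offset is now 9
  : > "$log"               # truncate from "outside", as cron would
  echo "line two"          # written at offset 9, not 0
)
wc -c < "$log"             # 18 bytes: the first 9 are a NUL-filled hole

# Writer opened with '>>': every write goes to the current end.
( exec >> "$log"
  : > "$log"               # truncate again
  echo "line two"          # appended at offset 0, the new end
)
wc -c < "$log"             # 9 bytes: just "line two" and its newline
```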


Side note

Also rm genrtr.log doesn't help because then the process stops updating the file.

Strictly it does not stop updating the file. The file gets unlinked from the directory, but it's still open, still written to, it still consumes more and more space in the filesystem.

Any new file is a different file even if it takes the same pathname. The process does not update the new file because it never opens it, it doesn't even notice it. The file descriptor used by the process still leads to the old (deleted) file.
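A quick sketch of that behaviour, using a shell file descriptor to stand in for the descriptor genrtr holds open:

```shell
f=/tmp/unlink-demo.log      # illustrative path
exec 3> "$f"                # open the file on fd 3
rm "$f"                     # unlink it from the directory
echo "still here" >&3       # the write still succeeds...
[ -e "$f" ] || echo "...but the name is gone from the directory"
exec 3>&-                   # only closing the fd frees the space
```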

0

What's the frequency of the run?

Try tee to redirect output to the logfile. It's a utility often used to make a copy of the stream and redirect that copy into an output file:

  command | tee command_result.log

You could have a monitor function in a wrapper script that invokes this program and deletes the top 10 lines after some interval; this way your log file won't grow beyond a few KB.
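A possible sketch of that monitor step (the function name trim_log and the 10-line count are illustrative). It writes the trimmed copy back with cat rather than mv, so the inode the writer holds open is not replaced:

```shell
# Drop the first 10 lines of the log in place. A plain mv would swap
# in a new inode and detach the file from the still-running tee.
trim_log() {
  sed '1,10d' command_result.log > /tmp/trimmed.$$ &&
    cat /tmp/trimmed.$$ > command_result.log &&
    rm -f /tmp/trimmed.$$
}
```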

Also, if there are useless spaces in between, you could use the translate (tr) utility to squeeze them. Example:

Nitin@Kaizen ~
$ df -h | head -1
    Filesystem      Size  Used Avail Use% Mounted on

Nitin@Kaizen ~
$ df -h | head -1 | tr -s ' '
    Filesystem Size Used Avail Use% Mounted on   *** note the squeezed spaces

Hope this helps.

Nitin4873
  • 189
  • 1
  • 1
  • 7
0

There are a few things you could try. The easiest is to just print the last lines of the file:

tail genrtr.log

Then, once the process has finished, delete the log file. Another option would be to periodically overwrite the file.

  • Launch the process in the background:

    genrtr > genrtr.log &
    
  • Overwrite the contents of the logfile:

    echo > genrtr.log
    

The file is now truncated but will continue to be updated by genrtr, so the next update report will be written to it. You could automate this, for example truncating the file whenever it gets larger than 1 MB:

while true; do
    if [ "$(stat -c%s genrtr.log)" -gt 1000000 ]; then
        tail genrtr.log > /tmp/foo && cat /tmp/foo > genrtr.log
    fi
    sleep 1   # the process only writes once a second, no need to spin
done

That little scriptlet will run until you stop it (while true), and every time genrtr.log grows larger than one MB, it will keep the last few lines and delete the rest of the file.


UPDATE:

As Scott very correctly pointed out below, if your output contains \r to clear the line, tail will not work as expected. This, however, should:

while true; do
    if [ "$(stat -c%s genrtr.log)" -gt 1000000 ]; then
        tail genrtr.log | perl -pe 's/.+\r(.+)/$1\n/' > /tmp/foo &&
            cat /tmp/foo > genrtr.log
    fi
    sleep 1
done

The perl command deletes everything before the last \r and prints the last "line" (the data after the final \r, followed by a newline). The result should be that the last line is kept, the rest of the file is cleared, and the file continues to be populated.

terdon
  • 54,564
0

I believe that this is going to be very tricky to do without either modifying genrtr or writing a new program.  If you’re more comfortable doing the latter, I suggest this outline:

#include <stdio.h>

int main(int argc, char *argv[])
{
    int   c;
    FILE  *fp;

    if (argc < 2)
    {
        fprintf(stderr, "usage: %s log_file\n", argv[0]);
        return 1;
    }
    fp = fopen(argv[1], "w");
    if (fp == NULL)
    {
        perror(argv[1]);
        return 1;
    }
    while ((c = getchar()) != EOF)
    {
        putchar(c);
        if (c == '\r')      /* the character that ends a progress line */
        {
            fflush(fp);     /* You should probably check for errors here, too. */
            rewind(fp);     /* next line overwrites the file from the start */
        }
        else
            putc(c, fp);
    }
    return 0;
}

This acts like a combination of tee and tail -- reading standard input, and writing it to standard output and a file -- with the difference that it keeps only the last line in the file.  Then you would run

genrtr | the_above_program genrtr.log