Soft kill a process so the last bytes of output are written to the redirected file

Hey guys,

I have a python script that I call with this line:

python mypythonscript.py >> results.csv &

The problem is that the redirection of stdout to the file results.csv only writes in 4096-byte blocks.

So if I kill this process with

kill [pidOfTheScript]

the last bytes that the script produced will be missing.

Is it possible to soft-kill the process so that the last bytes the script produces are written to the file?

Thanks :slight_smile:

You have to trap the signal and flush the output file descriptor.

I do not know Python signal handling in detail, but you should trap SIGTERM, which is the signal kill sends by default. A generic trap like you want calls a function that cleans up, completes I/O, and closes files (which should flush anyway), then exits the process.

Found some info on the signal module in Python:

https://docs.python.org/2/library/signal.html
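A minimal sketch of such a trap (the handler name is mine, not from the thread): it catches SIGTERM, flushes stdout so the buffered tail reaches the redirected file, and then exits cleanly.

```python
import signal
import sys

def flush_and_exit(signum, frame):
    # Flush whatever is still sitting in Python's stdout buffer
    # so the tail of the output reaches the redirected file.
    sys.stdout.flush()
    sys.exit(0)

# Install the handler for SIGTERM, the signal `kill` sends by default.
signal.signal(signal.SIGTERM, flush_and_exit)
```

With this in place, a plain `kill [pid]` runs the handler instead of terminating the interpreter mid-buffer.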

1 Like

thank you jim mcnamar,

But the problem is not the Python script; it is OK to hard-kill it. The problem is the redirection to the file. It looks like it collects some of the script's output, and only once it has collected 4096 bytes does it write out exactly those 4096 bytes.

So the last bytes are always lost.

It doesn't hurt the script or your system to hard-kill it, no -- but it certainly doesn't do what you want.

Python does single writes every 4096 bytes because this is more efficient than doing many tinier writes. Python streams do this on the assumption the stream will end naturally, not get hard-killed while waiting for the 4096th byte. This is not a shell thing, a redirection thing, or any kind of system buffer -- the buffer is in Python, part of Python's code, controlled by Python, and must be configured in Python to disable Python's write buffer.

This is not the same as the system disk cache: that is transparent, so you would see the correct result if the system knew the data was supposed to be there.

This is not the same as the buffers used for pipes. Those don't apply when not using pipes.

You cannot force Python or any other program to not buffer from the outside, unless there's some mysterious NEVER_BUFFER environment variable Python responds to or something.
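To illustrate configuring it from inside the script (a sketch, not from the thread -- the `write_row` helper is hypothetical): flushing after each record defeats the block buffer, at the cost of more write calls.

```python
import sys

def write_row(row, stream=sys.stdout):
    # Write one CSV row and flush immediately, so a hard kill
    # cannot strand it inside the 4096-byte block buffer.
    stream.write(row + "\n")
    stream.flush()

write_row("1,2,3")
```

Every row is pushed out as soon as it is produced, so at most the row being written at the instant of the kill can be lost.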

1 Like

Ohhh... okay, I get it. Thank you very much :slight_smile:

I found another solution:

stdbuf -oL

But you are both right! I should change it in python. Thank you very much

There is a mysterious variable Python responds to, actually!

export PYTHONUNBUFFERED=1

./myscript >> file &

I don't think stdbuf is applicable unless you're writing to a pipe or TTY. Block or line buffering is not a system feature for file streams; Python must be doing that by itself.

1 Like

This works well. Thanks Corona688 :slight_smile: