I stumbled across a somewhat strange behavior of tar and can find no explanation for it: I was testing a DVD for read errors and thought I would simply tar the contents and direct the output to /dev/null:
tar -cvf - /my/mountpoint/*ts > /dev/null
This way I expected the system to read the complete content and eventually to report a read error if there is one. To my amazement the command came back immediately without any error. Still, I didn't believe it had read anything (save for maybe the directory information), so I directed the output to a real file instead:
tar -cvf - /my/mountpoint/*ts > /tmp/somefile
This time tar did read the files and subsequently reported the read error I was expecting in one file.
My question is: why didn't it read the files before? Is there something I am missing?
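For anyone who wants to reproduce the comparison, here is a minimal sketch. It assumes GNU tar; the /tmp paths and the sample .ts file are made up for illustration and stand in for the DVD content:

```shell
# Create a small sample file standing in for the DVD content (illustrative path).
mkdir -p /tmp/tartest
echo "sample data" > /tmp/tartest/sample.ts

# Case 1: archive to a regular file. tar reads the member and writes a
# full archive, padded to the default 10240-byte record size.
tar -cf - /tmp/tartest/sample.ts > /tmp/tartest/archive.tar
wc -c < /tmp/tartest/archive.tar

# Case 2: pipe the archive into a consumer that discards it. tar is now
# writing to a pipe, and it reads every member even though the bytes are
# ultimately thrown away.
tar -cf - /tmp/tartest/sample.ts | wc -c
```

Only the direct redirection to /dev/null behaves differently, which is exactly the inconsistency the question is about.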
First off, I'd like to state that I do NOT have a problem. I just want to achieve a better understanding. So you don't need to tell me workarounds, as I have already found one.
This is exactly what I was expecting tar to do: read all the files.
Sorry, but this is simply not true: a dash as the filename signifies stdout, and whether I use a redirection or a pipeline to process tar's output further should make no difference.
In fact (as I stated in my first post), the command worked the way I wrote it when I used a real file (instead of /dev/null) as the output destination. It is this inconsistency - the command reading all the files when a real file is the destination, but not when the destination is /dev/null - that I want to understand.
Does your OS have the equivalent of truss on Solaris? When a program is run under truss, its system calls are printed to stderr. If you have something like that, running tar under it once with the output redirected to /dev/null and once with a regular file might show you why it finishes so soon.
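On Linux the equivalent is strace. A sketch of the comparison suggested above, assuming strace is installed (the sample file and trace file names are illustrative; substitute the DVD paths from the question on the real system):

```shell
# Create a small sample file to trace against (illustrative stand-in for
# the DVD content).
mkdir -p /tmp/tartest
echo "sample data" > /tmp/tartest/sample.ts

# Trace tar's system calls while the archive is redirected to /dev/null...
strace -f -o /tmp/trace.devnull tar -cf - /tmp/tartest/sample.ts > /dev/null

# ...and again while it goes to a regular file.
strace -f -o /tmp/trace.file tar -cf - /tmp/tartest/sample.ts > /tmp/tartest/archive.tar

# Compare how often the member file shows up in each trace; a difference
# would confirm that tar treats the two destinations differently.
grep -c 'sample.ts' /tmp/trace.devnull
grep -c 'sample.ts' /tmp/trace.file
```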