It seems to work, but after a period of 5-8 hours the split text files stop growing in size and content, and I can't get more data until I kill the processes. When the acquisition stops I get no error on screen, and the processes still appear if I call ps.
With BSD netcat the -k option will listen forever; with GNU netcat I do not think it works.
Both should be available on Linux distributions.
On Debian:
~$ apt-cache search netcat-
netcat-traditional - TCP/IP swiss army knife
netcat-openbsd - TCP/IP swiss army knife # this is the one you want with -k option.
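For reference, a minimal sketch of a persistent listener with the OpenBSD variant; the port number 56045 is taken from the commands quoted in this thread, and the output file name is an assumption:

```shell
# Install the OpenBSD variant on Debian
sudo apt-get install netcat-openbsd

# -k keeps nc listening instead of exiting after the first peer goes away;
# with -u it stays bound and can receive UDP datagrams from multiple hosts
nc -kul 56045 >> data.txt
```

Note that the OpenBSD variant takes the port as a plain argument after -l, without the GNU-style -p flag.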
Thanks Chubler_XL, I will try it and let you know.
Yesterday I tried a simpler command:
nc -ul -p 56045 -q -1 -vv >data.txt
and there was no problem, so it seems that it has to be the awk or split command.
Thanks Peasant, I'm using GNU netcat, but I guess the nc -q -1 option should work and wait forever, and the test I ran yesterday seems to confirm this. At this moment I guess the problem is in the awk or split command.
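A hedged sketch of the buffering theory: when awk's stdout goes to a pipe rather than a terminal, it is block-buffered by default, so lines can sit in a ~4 KB buffer for a long time before split ever sees them. Calling fflush() after each print pushes every line out immediately. The pipeline below mirrors the nc command quoted in this thread; the split options and output prefix are assumptions:

```shell
# nc feeds awk; fflush() flushes each line down the pipe at once
# instead of waiting for awk's output buffer to fill
nc -ul -p 56045 -q -1 \
  | awk '{ print; fflush() }' \
  | split -b 1M - chunk_
```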
I don't think you've said what operating system or filesystem type this is. Is there a possibility that you're hitting a maximum file size for the filesystem? When it stops writing to the file, is the file near or about the same size each time?
I can say that it seems the split command is stalling the whole pipeline, because some data arrives at pre-awk.data but not at pre-split.data.
hicksd8, I'm working on GNU/Linux 9.4 and the file stream could be infinite. I'm dividing it with split into 1M files, and curiously the process always stops between 3-4 files (3.5-4M). I don't think it's a problem with the file size, but maybe a buffer size or buffer wait, as Corona668 is pointing out.
Any clues on how to trace this? I'm not getting any error message.
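One low-effort way to see where the pipeline is stuck, assuming strace is available: attach to each stage and look at the syscall it is blocked in. The PID below is a placeholder:

```shell
# List the pipeline's processes to get their PIDs
# (the bracketed first letter keeps grep from matching itself)
ps -ef | grep -E '[n]c|[a]wk|[s]plit'

# Attach to one stage (1234 is a placeholder PID):
#   blocked in read(0, ...)  -> waiting for input from the upstream pipe
#   blocked in write(1, ...) -> the downstream pipe is full and nothing is draining it
strace -p 1234 -e trace=read,write
```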
and then execute ./AISdecode.sh &
Should I have any problems when the SSH session closes?
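Depending on your shell's settings, a background job may be killed by SIGHUP when the SSH session closes. A common precaution, sketched below (the log file name is an assumption):

```shell
# nohup makes the job ignore the hangup signal sent when the terminal goes away
nohup ./AISdecode.sh > aisdecode.log 2>&1 &
disown   # optionally also drop it from the shell's job table (bash/zsh builtin)
```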
Note that in this case I've added an fflush(); to the awk command, eliminated the split command, and redirected stderr and stdout to a file, just to test part by part and see if I can find the problem. In my last try, without 2>&1 &, it stopped with a file of 3.8 M; now I'm trying this, let's see.
Run a screen session, detach, and return to it later (attach).
There are, of course, other alternatives and utilities, but I find screen mature, so I recommend it.
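The suggested screen workflow, sketched (the session name is arbitrary; AISdecode.sh is the script from this thread):

```shell
screen -S ais          # start a named session
./AISdecode.sh         # run the script inside it
# press Ctrl-a then d to detach; the script keeps running

screen -ls             # later: list sessions
screen -r ais          # reattach to the named session
```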
Is it a real pain to try out the BSD nc variant to check if you are having the same symptoms?
To really detect what is happening, one would need to run tcpdump (or equivalent) and examine the generated files afterwards, looking for suspect behaviour.
I see no other way of telling what is going on if the network level is the problem of some sort.
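A sketch of that capture, assuming the UDP port from the thread; the interface and capture file name are placeholders. Run it alongside the pipeline and compare packet timestamps with the moment the output files stop growing:

```shell
# Record the raw stream to a pcap file for later inspection
tcpdump -i eth0 -w ais.pcap 'udp port 56045'

# Afterwards, print human-readable timestamps and check whether
# packets were still arriving when the output files stalled
tcpdump -r ais.pcap -tttt | tail -n 20
```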
When did the problem surface?
Did it stop working at some point after working for some time?
Would it make sense to start from the other side and flood the network and reception pipe from the sender side, e.g. by cat'ing megabytes and megabytes of data?
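One way to run that flood test, sketched with placeholders (host name, payload size): push a known volume of data through the same path and check whether the receiving pipeline keeps up:

```shell
# Send 100 MB of zeros over UDP to the receiver
# (receiver.example.com and the port are placeholders)
dd if=/dev/zero bs=1M count=100 | nc -u receiver.example.com 56045
```

If the receiver's files stall well before 100 MB arrive, the problem is on the receiving side rather than in the original data source.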
What do you mean by the BSD nc variant?
The problem is a new issue: I recently obtained this data stream and I want to save it continuously, but at the moment I can't get a satisfactory result; from time to time the pipe stops receiving data and saving it to the file.
I guess that from time to time some packets get lost, or the transmission temporarily stops, or the pipe buffer is filled, so even when the data is still being received, the data storage gets stuck.
- I've realized that the script only stops at 5:05 and 17:05, so I thought that maybe it was a cron job on the server, but I could not find anything. I tried the same in my local area network and the same happens, so maybe the data stream is doing something weird at those times.
- I use vim to take a look at the files when this happens, to see if there's any special mark. I found that at the end of the file there are several trailing characters:
What you need to consider, if you do not like tail, is to develop some kind of daemon, usually written in C: an attached child process while the parent stays there forever. If I understand what you have there.
PS: fflush() on every line forces disk I/O instead of buffered I/O. Since nc can produce an endless stream of data, this change might help.
Can you please try the BSD version?
It should take little time and effort.
This was a well-intentioned suggestion earlier, since I had issues with GNU nc while experimenting with Go and such.
Similar issues to the ones you describe here.
I did not debug further.
I only switched versions, which resulted in the expected behavior.