Iconv on large files

Hi All,

I am using iconv to convert huge files, but the process is getting killed. I tried the option from the link below, i.e.

iconv -f UCS-2 -t UTF-8 < inputfile.txt > outputfile.txt

However, the process still gets killed. We do not have iconv-chunk.

Can you please suggest a way to process the file without my having to explicitly split the input file in the script?

My crystal ball is a little bit cloudy this afternoon. Exactly what happens?

  1. What diagnostics are produced?
  2. What is the size of inputfile.txt?
  3. When it is killed, what is the size of outputfile.txt?
  4. How much space is available in the filesystem where you are creating outputfile.txt?
  5. What operating system are you using (including the release number)?
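
For reference, commands along these lines (run from the directory where outputfile.txt is being created) would answer 2-5:

  ls -l inputfile.txt outputfile.txt   # sizes of the input and the partial output
  df -h .                              # free space on the filesystem you are writing to
  uname -a                             # OS and kernel release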

Please find my responses below:

  1. I just get a "Killed" message.
  2. inputfile.txt is approx 60 GB.
  3. outputfile.txt is approx 50 GB when the process is killed.
  4. There is around 500 GB free on the filesystem.
  5. Linux 2.6.32-642.3.1.el6.x86_64

Could you watch free memory and swap space during the conversion, right up to the point where it is killed?
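
A bare "Killed" with no other diagnostic is the signature of the kernel's OOM killer. Something along these lines in a second terminal would show memory and swap draining while the conversion runs (the 5-second interval and the log name are just examples):

  while sleep 5; do date; free -m; done | tee mem.log

and after the kill, the kernel log should confirm who pulled the trigger:

  dmesg | grep -i -e 'out of memory' -e 'killed process'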

I wonder whether it is creating a temp file during the conversion... what about the free space in whatever tmp filesystem it might use (/tmp or /var/tmp)?
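
That is easy enough to check while the conversion runs:

  df -h /tmp /var/tmp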
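
And if memory does turn out to be the problem, one way to get what the original question asked for, conversion in chunks without ever splitting the file on disk, is to drive iconv with dd in a loop. A rough sketch, assuming GNU dd and stat; since UCS-2 is a fixed 2 bytes per character and 1 MiB is even, every chunk starts on a character boundary:

  #!/bin/sh
  in=inputfile.txt
  out=outputfile.txt
  chunk=512                   # 1 MiB blocks per iconv run, i.e. 512 MiB chunks
  size=$(stat -c %s "$in")    # input size in bytes
  blocks=$(( (size + 1048575) / 1048576 ))
  skip=0
  : > "$out"                  # truncate the output once, then append
  while [ "$skip" -lt "$blocks" ]; do
      dd if="$in" bs=1M skip="$skip" count="$chunk" 2>/dev/null \
          | iconv -f UCS-2 -t UTF-8 >> "$out" || exit 1
      skip=$(( skip + chunk ))
  done

Two caveats: chunks after the first will not carry a BOM, so you may need to name the byte order explicitly (UCS-2LE or UCS-2BE); and this only works if the data really is UCS-2 rather than UTF-16, where a surrogate pair could straddle a chunk boundary.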