How to convert Text data to xls?

Hi Team,

I have created a script to output DB query data to a text file. It displays output as follows for two different queries appended to the same file.

I want help converting this data to xls format so that the output of the first query goes into one tab and the output of the second query into another tab. Please help.

BATCHID     STARTDATE                  FINDATE                   
----------- -------------------------- --------------------------
    2000001 2014-08-06-02.37.35.786000 -                         
    2000002 2014-08-06-02.37.07.833000 -                         
    2000003 2014-08-06-02.37.02.705000 -                         
    2000004 2014-08-06-02.36.53.644000 -                         
    
    4 record(s) selected.
    
1          2           3          
---------- ----------- -----------
07/30/2014           3          33
07/30/2014           3          28
07/30/2014           3          21
07/30/2014           3          24
 
4 record(s) selected.

You need EXCEL (or equivalent) for that. Create two independent .csv (comma separated) files from your queries and read them into EXCEL.

I tried to output the result of the query directly into a csv file, but the columns are not separated; the Excel file shows the whole result lumped together. I want the query output to appear in separate columns in Excel. How do I do that?

Oracle provides SET COLSEP ','. Maybe your system has something similar? Try ',' or ';'.

1 Like

It appears that your script creates a text file with a single space as the field separator.

I do this all the time. I cut this data from your post:

BATCHID STARTDATE FINDATE 
----------- -------------------------- --------------------------
2000001 2014-08-06-02.37.35.786000 - 
2000002 2014-08-06-02.37.07.833000 - 
2000003 2014-08-06-02.37.02.705000 - 
2000004 2014-08-06-02.36.53.644000 - 

and created a file in notepad called data.txt. Then I used this procedure:

Open Excel 2007.

Click "Data" on the ribbon.

Click "From Text", browse to data.txt and open it.

Pick "Delimited" and click Next.

Uncheck "Tab", check "Space", then click Next.

Click "Finish".

Accept the default "=$A$1" and click OK -- the data will populate the spreadsheet.

Yeah, it has DB2 installed, and it creates a single space as the field separator.

---------- Post updated at 07:50 AM ---------- Previous update was at 07:49 AM ----------

Hi, thanks for the explanation. However, I don't want to import text into Excel, as that is an additional manual task. I want to create an Excel file directly, using whatever facilities DB2 provides.

@Rudic: I don't have COLSEP in DB2. What else can I use here?

IMHO only Excel can create Excel files... an xls contains more than just data... it has info on how the spreadsheet is structured, etc.

You can also add comma delimiters between the columns and write to a csv file, then open the csv file in Excel. Note that you will need to choose another delimiter if the data could contain embedded commas. Here is an example:

SELECT
   COLUMN1, CHAR(','),
   COLUMN2, CHAR(','),
   COLUMN3, CHAR(','),
   COLUMN4
FROM
   TABLE
1 Like

Why not a Python or Perl script using one of John McNamara's modules? A Google search for "python csv to xlsx mcnamara" turns up his Git repository as the top result.
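
For what it's worth, a rough Perl sketch along those lines, using McNamara's Excel::Writer::XLSX module, could look something like this (the output file name and sheet naming are placeholders I made up; each csv file given on the command line becomes its own worksheet):

#!/usr/bin/env perl
# Sketch only: write one worksheet per csv file into a single xlsx workbook.
use strict;
use warnings;

use Text::CSV;
use Excel::Writer::XLSX;

my $workbook = Excel::Writer::XLSX->new('output.xlsx');   # example file name
my $csv      = Text::CSV->new({ binary => 1 }) or die "Cannot create Text::CSV object\n";

my $n = 0;
for my $file (@ARGV) {
    # Name the sheet after the file; Excel limits sheet names to 31 chars
    # and forbids characters such as / \ ? * [ ].
    (my $sheet = $file) =~ s{.*/}{};
    $sheet =~ s/\.csv$//i;
    my $worksheet = $workbook->add_worksheet($sheet || 'Sheet' . ++$n);

    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $row = 0;
    while (my $fields = $csv->getline($fh)) {
        $worksheet->write_row($row++, 0, $fields);         # one csv record per row
    }
    close $fh;
}

$workbook->close();

Run it as something like $ perl csv2xlsx.pl query1.csv query2.csv (file names made up) and you get one tab per query, which is what the original post asked for. Note it produces .xlsx rather than .xls, which any recent Excel opens fine.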

Have a look at this thread, where I suggest an awk script in combination with a template file to produce an "XML Spreadsheet 2003" file from CSV file(s). Excel can open these XML files directly as if they were spreadsheets.

Would this sort of approach be acceptable?

If so we could tailor this approach for your particular problem.
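
To give a flavour of what gets produced, here is a rough Perl sketch (not the awk script from that thread) that writes a minimal "XML Spreadsheet 2003" workbook with one worksheet per csv file named on the command line; the sheet names are simple placeholders and every cell is written as a string:

#!/usr/bin/env perl
# Sketch only: emit an "XML Spreadsheet 2003" workbook on stdout,
# one <Worksheet> per csv file.
use strict;
use warnings;

use Text::CSV;

my $csv = Text::CSV->new({ binary => 1 }) or die "Cannot create Text::CSV object\n";

print qq{<?xml version="1.0"?>\n};
print qq{<?mso-application progid="Excel.Sheet"?>\n};
print qq{<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"\n};
print qq{          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">\n};

my $n = 0;
for my $file (@ARGV) {
    my $name = 'Sheet' . ++$n;                    # placeholder sheet names
    print qq{ <Worksheet ss:Name="$name">\n  <Table>\n};

    open my $fh, '<', $file or die "Cannot open $file: $!";
    while (my $fields = $csv->getline($fh)) {
        print qq{   <Row>\n};
        for my $value (@$fields) {
            # Escape the characters that are special in XML.
            for ($value) { s/&/&amp;/g; s/</&lt;/g; s/>/&gt;/g; }
            print qq{    <Cell><Data ss:Type="String">$value</Data></Cell>\n};
        }
        print qq{   </Row>\n};
    }
    close $fh;

    print qq{  </Table>\n </Worksheet>\n};
}

print qq{</Workbook>\n};

Run it as something like $ perl csv2xml.pl query1.csv query2.csv > report.xml (names made up) and Excel should open report.xml with one tab per csv file.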

1 Like

Thanks, this is almost perfect, but it causes the column header row to appear as "Column1 2 Column 4 Column 3 6", and these are not separated into different columns. How do I correct this?

You can try something like this to create a formatted header record (note that without an ORDER BY, a UNION does not guarantee that the header row comes out first):

SELECT
   'COLUMN1', ',',
   'COLUMN2', ',',
   'COLUMN3', ',',
   'COLUMN4'
FROM
   SYSIBM.SYSDUMMY1
UNION ALL
SELECT
   -- non-character columns may need a CHAR()/VARCHAR() cast here
   -- so that both halves of the UNION have compatible types
   COLUMN1, CHAR(','),
   COLUMN2, CHAR(','),
   COLUMN3, CHAR(','),
   COLUMN4
FROM
   TABLE

You may use Data::Table::Excel, a Perl module, to automate the conversion process. The following example shows how to use it:

#!/usr/bin/env perl

use strict;
use warnings;

use Text::CSV;
use Data::Table;                       # provides fromFile()
use Data::Table::Excel qw(tables2xls);

my $csv = Text::CSV->new({binary => 1, eol => "\n"}) or die "$!\n";

# First pass: turn each whitespace-separated txt file into a csv file.
my @table_names;
for my $file (@ARGV) {
    my $csv_file = $file;
    $csv_file =~ s/\.txt$/.csv/;
    push @table_names, $csv_file;

    open my $FH_TXT, "<", $file or die "$!\n";
    open my $FH_CSV, ">", $csv_file or die "$!\n";

    while (my $line = <$FH_TXT>) {
        # split ' ' discards leading blanks, so the indented lines
        # do not produce an empty first column.
        my @cols = split ' ', $line;
        $csv->print($FH_CSV, \@cols);
    }

    close $FH_TXT or die "$!\n";
    close $FH_CSV or die "$!\n";
}

my $xls_file = "output.xls";

# Second pass: load each csv file into a Data::Table object.
my @tables;
for my $name (@table_names) {
    my $table = Data::Table::fromFile($name);
    push @tables, $table;
}

# Write all tables into one xls file, one worksheet per table
# (the trailing array refs are per-table options; see the module docs).
my @portraits = map { 1 } @tables;
my @column_headers = map { 0 } @tables;
tables2xls($xls_file, \@tables, undef, undef, \@portraits, \@column_headers);

Save the script as convert.pl (or any other name). To use it, remove the long dash separator lines from the txt files; then do this:

$ convert.pl record01.txt record02.txt ...

All txt files will be saved into one xls file, with each txt file in a separate sheet.

Use sed to convert the runs of spaces to commas and rename the file with a .csv extension. Excel will open it right up.
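
Something along these lines should do it (shown here as a Perl one-liner, which uses the same substitution syntax as sed; the file names are just examples). It strips leading blanks, squeezes each remaining run of spaces to a single comma, and skips the dashed separator line; the "record(s) selected" trailer would still need removing by hand:

$ perl -ne 'next if /^\s*-+\s/; s/^ +//; s/ +/,/g; print' query_output.txt > query_output.csv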

Mike

---------- Post updated at 07:14 PM ---------- Previous update was at 07:11 PM ----------

You can also script the entire Excel import as an Excel macro. I did this some years ago to get a nicely formatted printout waiting for me on the printer every morning on the way to a meeting.

Mike