I tried to output the result of the query directly into a CSV file, but the columns are not separated. Excel displays the whole result in a single column. I want the query output split into separate columns in Excel. How do I do that?
Yeah, it has DB2 installed, and it uses a single space as the field separator.
---------- Post updated at 07:50 AM ---------- Previous update was at 07:49 AM ----------
Hi, thanks for the explanation. However, I don't want to import the text into Excel, as that is an additional manual step. I want to create an Excel file directly, using whatever facility DB2 provides.
@Rudic: I don't have colsep in DB2. What else can I use here?
You can also add comma delimiters between the columns and write the result to a CSV file, then open the CSV file in Excel. Note that you will need to choose another delimiter if the data could contain embedded commas. Here is an example, concatenating the columns with || so each row comes out as a single comma-delimited string (non-character columns need a cast first, e.g. VARCHAR(COLUMN2)):
SELECT
COLUMN1 || ',' ||
COLUMN2 || ',' ||
COLUMN3 || ',' ||
COLUMN4
FROM TABLE
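A note on the embedded-comma caveat above: if you build the CSV with a small script instead of raw concatenation, the quoting is handled for you. Here is a minimal Python sketch using only the standard library's csv module; the rows are hypothetical sample data standing in for the query result (the example does not connect to DB2):

```python
import csv
import io

# Hypothetical rows standing in for a fetched DB2 result set
rows = [
    ["COLUMN1", "COLUMN2", "COLUMN3", "COLUMN4"],
    ["Smith, John", "42", "NY", "2023-01-05"],  # note the embedded comma
]

buf = io.StringIO()
# csv.writer quotes any field containing the delimiter, so embedded
# commas survive the round trip into Excel
writer = csv.writer(buf, lineterminator="\n")
writer.writerows(rows)
print(buf.getvalue(), end="")
```

In real use you would write to a file opened with `open("out.csv", "w", newline="")` rather than a StringIO buffer; Excel then opens the file with the columns correctly separated.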
Why not a Python or Perl script using one of John McNamara's modules? A Google search for python csv to xlsx mcnamara turns up his Git repository as the top result.
Have a look at this thread, where I suggest an awk script in combination with a template file to produce an "XML Spreadsheet 2003" file from CSV file(s). Excel can open these XML files directly as if they were spreadsheets.
Would this sort of approach be acceptable?
If so, we could tailor it to your particular problem.
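To illustrate the idea (this is a Python stdlib sketch of the same concept, not the awk script from that thread): the "XML Spreadsheet 2003" format is plain XML with Workbook/Worksheet/Table/Row/Cell elements, so it is easy to generate from CSV text. All data is written as strings here for simplicity:

```python
import csv
import io
from xml.sax.saxutils import escape

def csv_to_spreadsheetml(csv_text, sheet_name="Sheet1"):
    """Render CSV text as a minimal 'XML Spreadsheet 2003' document."""
    rows_xml = []
    for row in csv.reader(io.StringIO(csv_text)):
        cells = "".join(
            f'<Cell><Data ss:Type="String">{escape(v)}</Data></Cell>' for v in row
        )
        rows_xml.append(f"<Row>{cells}</Row>")
    return (
        '<?xml version="1.0"?>\n'
        '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"\n'
        ' xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">\n'
        f' <Worksheet ss:Name="{escape(sheet_name)}"><Table>\n  '
        + "\n  ".join(rows_xml)
        + "\n </Table></Worksheet>\n</Workbook>\n"
    )

xml_doc = csv_to_spreadsheetml("COLUMN1,COLUMN2\na,b\n")
print(xml_doc)
```

Saved with an .xml extension, Excel opens the result as a one-sheet workbook with the CSV fields in separate columns.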
Thanks, this is almost perfect, but the column header row appears as "Column1 2 Column 4 Column 3 6", and these are not separated into different columns. How do I correct this?
You may use Data::Table::Excel, a Perl module, to automate the conversion. The following example shows how to use it:
#!/usr/bin/env perl
use strict;
use warnings;
use Text::CSV;
use Data::Table;
use Data::Table::Excel qw(tables2xls);

my $csv = Text::CSV->new({binary => 1, eol => "\n"}) or die "$!\n";
my @table_names;

# First pass: convert each whitespace-separated txt file to a CSV file
for my $file (@ARGV) {
    (my $csv_file = $file) =~ s/\.txt$/.csv/;
    push @table_names, $csv_file;
    open my $FH_TXT, "<", $file or die "$file: $!\n";
    open my $FH_CSV, ">", $csv_file or die "$csv_file: $!\n";
    while (my $line = <$FH_TXT>) {
        chomp $line;
        my @cols = split ' ', $line;    # split ' ' skips leading whitespace
        $csv->print($FH_CSV, \@cols);
    }
    close $FH_TXT or die "$!\n";
    close $FH_CSV or die "$!\n";
}

# Second pass: load the CSV files as tables and write them all to one workbook
my $xls_file = "output.xls";
my @tables = map { Data::Table::fromFile($_) } @table_names;
my @portraits = map { 1 } @tables;    # one portrait-oriented sheet per table
my @column_headers = map { 0 } @tables;
tables2xls($xls_file, \@tables, undef, undef, \@portraits, \@column_headers);
Save the script as convert.pl (or any other name). Before using it, remove the long dashed separator lines from the txt files; then run:
$ convert.pl record01.txt record02.txt ...
All txt files will be saved to one xls file, with each txt file on a separate sheet.
Use sed to convert the spaces to commas and rename the file with a .csv extension; Excel will open it right up.
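The same substitution in Python, for anyone not on a box with sed (note the same caveat as before: this breaks fields that themselves contain spaces):

```python
import re

# One line of space-separated DB2 output
line = "val1   val2  val3"

# Collapse each run of whitespace into a single comma
csv_line = re.sub(r"\s+", ",", line.strip())
print(csv_line)  # val1,val2,val3
```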
Mike
---------- Post updated at 07:14 PM ---------- Previous update was at 07:11 PM ----------
You can also script the entire Excel import as an Excel macro. I did this some years ago to get a nicely formatted printout waiting for me on the printer every morning on the way to a meeting.