Dumping database results and doing checksums

Hey, I'm dumping two database result sets into two text files and running checksums on them.

Anyone know how to do this?

I was originally doing a line-by-line comparison with diff, but I need something faster for larger, GB-sized files. Anyone got tips?

How fast is your disk? How much memory do you have? You could run md5sum on each file and see whether the checksums match. When operating on a GB-sized file, it's not going to finish in a split second no matter what you use.
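A minimal sketch of the md5sum approach (the file names `dump_a.txt` and `dump_b.txt` are hypothetical stand-ins for your real dumps):

```shell
# Create two sample "dump" files standing in for the real DB dumps.
printf 'row1\nrow2\n' > dump_a.txt
printf 'row1\nrow2\n' > dump_b.txt

# md5sum reads each file once, sequentially, so on a multi-GB file
# it runs at roughly the speed your disk can deliver the bytes.
sum_a=$(md5sum dump_a.txt | cut -d' ' -f1)
sum_b=$(md5sum dump_b.txt | cut -d' ' -f1)

if [ "$sum_a" = "$sum_b" ]; then
    echo "match"
else
    echo "differ"
fi
```

Note this only gives a yes/no answer; if the files differ, it won't tell you where.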

A checksum is generally not used in this fashion. Using checksums to compare files from two different data sources is questionable: a checksum is usually used to ensure that a single file, or a copy of it, has not been modified. That said, doing a checksum on Linux is as simple as...

sum <file>

I didn't see the complete message regarding the large file.


Good point: a checksum effectively compares the files byte for byte, so a single irrelevant space would upset it.
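If the two dumps can legitimately differ in row order or trailing whitespace, one workaround is to normalize before hashing. A sketch, assuming the dumps are line-oriented text (file names hypothetical):

```shell
# Same rows, but different order and a trailing-space difference.
printf 'b  \na\n' > dump1.txt
printf 'a\nb\n'   > dump2.txt

normalize() {
    # Strip trailing whitespace, then sort so row order doesn't matter.
    sed 's/[[:space:]]*$//' "$1" | sort
}

h1=$(normalize dump1.txt | md5sum)
h2=$(normalize dump2.txt | md5sum)
[ "$h1" = "$h2" ] && echo "same content"
```

Sorting a multi-GB file is not free, though, so this trades speed for tolerance of cosmetic differences.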

That sounds kind of expensive. Standard SQL should be able to tell you something.

SELECT 'tablea' from_tab, x.* FROM (
    SELECT * FROM tablea
    MINUS
    SELECT * FROM tableb@otherdb
) x
UNION ALL
SELECT 'tableb' from_tab, x.* FROM (
    SELECT * FROM tableb@otherdb
    MINUS
    SELECT * FROM tablea
) x;

(Oracle syntax: MINUS is Oracle's name for EXCEPT, and @otherdb is a database link. The label column has to be added outside each MINUS so both sides of the set operation have the same column list.)

sum seems to be doing the trick. Can someone comment on what it does? I'm still researching.
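In GNU coreutils, `sum` prints a 16-bit checksum plus the file's size in blocks; by default it uses the old BSD rotating-checksum algorithm, and `-s` switches to the System V variant. With only 65,536 possible values it is weak, so unrelated files can collide; `cksum` (a 32-bit CRC) is a somewhat stronger drop-in. A quick look (`demo.txt` is just a throwaway example file):

```shell
printf 'hello world\n' > demo.txt

sum demo.txt       # BSD 16-bit rotating checksum + 1K block count
sum -s demo.txt    # System V checksum variant + 512-byte block count
cksum demo.txt     # 32-bit CRC + byte count, stronger than either
```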

How is it that if I go into a large file and change one letter, it picks up that change? What is behind that?