Removing duplicate lines when concatenating files

Hi,

I need to concatenate three files into one destination file. If any duplicate lines occur, they should be removed.
eg:
file1:

data1 value1
data2 value2
data3 value3

file2:

data1 value1
data4 value4
data5 value5

file3:

data1 value1
data4 value4
data3 value3
data6 value6

the result file:

data1 value1
data2 value2
data3 value3
data4 value4
data5 value5
data6 value6

How can I achieve this?

try:

cat file1 file2 file3 | sort | uniq

No need for sort | uniq because sort -u will do it in one step.

sort -u file*

or

sort -u file1 file2 file3
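One caveat: `sort -u` sorts the output, which happens to match the expected result in this example. If you instead need the lines kept in their original first-occurrence order, `awk` can de-duplicate without sorting. A minimal sketch, recreating the sample files from the question:

```shell
# Recreate the three sample files from the question
printf 'data1 value1\ndata2 value2\ndata3 value3\n' > file1
printf 'data1 value1\ndata4 value4\ndata5 value5\n' > file2
printf 'data1 value1\ndata4 value4\ndata3 value3\ndata6 value6\n' > file3

# awk prints a line only the first time it is seen,
# so duplicates are dropped while input order is preserved
awk '!seen[$0]++' file1 file2 file3 > result
cat result
```

Here the de-duplicated output matches the expected result file because the data happens to arrive in sorted order; on unsorted input, this `awk` one-liner keeps the original order while `sort -u` would not.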