Duplicate Keys

I am trying to insert CSV data into a table:

mysql> load data infile '/var/www/PLU.csv' into table Food2 fields terminated by ',' enclosed by '"' lines terminated by '\n' ;
ERROR 1062 (23000): Duplicate entry '4014' for key 'PRIMARY'

+-------------+-------------+------+-----+---------+-------+
| Field       | Type        | Null | Key | Default | Extra |
+-------------+-------------+------+-----+---------+-------+
| PLU         | char(6)     | NO   | PRI | NULL    |       |
| Item        | varchar(30) | YES  |     | NULL    |       |
| Type        | varchar(30) | YES  |     | NULL    |       |
| Size        | varchar(10) | YES  |     | NULL    |       |
| InStock     | int(5)      | YES  |     | NULL    |       |
| OrderPoint  | tinyint(5)  | YES  |     | NULL    |       |
| OPFlag      | tinyint(1)  | YES  |     | NULL    |       |
| StockWanted | int(3)      | YES  |     | NULL    |       |
| Price       | float       | YES  |     | NULL    |       |
| WeightFlag  | tinyint(1)  | YES  |     | NULL    |       |
+-------------+-------------+------+-----+---------+-------+

Here is a sample of the CSV data:

3615,"APPLES","Civni",,,,,,,
3630,"APPLES","Co-op 43",,,,,,,
4104,"APPLES","Cortland","Small",10,5,0,15,0.25,1
4106,"APPLES","Cortland","Large",12,5,0,15,0.30,1
4105,"APPLES","Cox Orange Pippin",,,,,,,
4107,"APPLES","Crab",,15,7,0,25,0.15,0

I have removed the troublesome primary key value, dropped and recreated the table, and recreated the CSV file, all with no success.

I did not have this problem when I tried it, which implies something is already in your table.
I created a testcase of your scenario:

Create these three files:

file 1: testcase (creates the DB and table, writes the CSV file, and loads it into the table)

#!/bin/bash
#testcase - create the DB and table, write the CSV, and load it
set -x
DB="ZYX"
DIR="/var/lib/mysql/${DB}"
FILE="${DIR}/x.csv"

mysql <<EOD
create database ${DB};
use ${DB};
CREATE TABLE Food2 (
PLU CHAR(6) ,
Item varchar(30) ,
Type         varchar(30) ,
Size         varchar(10) ,
InStock      int(5)     ,
OrderPoint   tinyint(5),
OPFlag       tinyint(1),
StockWanted  int(3)   ,
Price        float   ,
WeightFlag   tinyint(1) ,
 PRIMARY KEY (PLU)
);
EOD
cat <<EOD2 > ${FILE} 
3615,"APPLES","Civni",,,,,,,
3630,"APPLES","Co-op 43",,,,,,,
4104,"APPLES","Cortland","Small",10,5,0,15,0.25,1
4106,"APPLES","Cortland","Large",12,5,0,15,0.30,1
4105,"APPLES","Cox Orange Pippin",,,,,,,
4107,"APPLES","Crab",,15,7,0,25,0.15,0
EOD2
mysql << EOD3
use ${DB}
load data infile '${FILE}' into table Food2 fields terminated by ',' enclosed by '"' lines terminated by '\n' ;
EOD3

file 2: testcase.show (shows the data in the table)

#!/bin/bash
#testcase.show - display the rows currently in the table
set -x
DB="ZYX"
DIR="/var/lib/mysql/${DB}"
FILE="${DIR}/x.csv"

mysql <<EOD
use ${DB};
select * from Food2;
EOD

file 3: testcase.cleanup (removes the DB and table so you can retest)

#!/bin/bash
#testcase.cleanup - remove the CSV file, table, and DB
set -x
DB="ZYX"
DIR="/var/lib/mysql/${DB}"
FILE="${DIR}/x.csv"

rm ${FILE}
mysql <<EOD
use ${DB};
drop table Food2;
drop database ${DB};
EOD

To test:

chmod +x testcase*
./testcase
./testcase.show
#try re-importing the data; it will fail with the duplicate-key error
./testcase.cleanup
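
Note that the re-import fails because, without a modifier, a non-LOCAL LOAD DATA stops at the first duplicate on a unique key, which is exactly the ERROR 1062 you saw. If the duplicates in the file were expected, MySQL's documented IGNORE and REPLACE modifiers change that behavior; a sketch against your original command:

-- skip incoming rows whose PLU already exists in the table
load data infile '/var/www/PLU.csv' ignore into table Food2 fields terminated by ',' enclosed by '"' lines terminated by '\n' ;

-- or overwrite existing rows with the incoming ones
load data infile '/var/www/PLU.csv' replace into table Food2 fields terminated by ',' enclosed by '"' lines terminated by '\n' ;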

I was able to insert this CSV file with no problems, so I could not recreate your error.
You state that you dropped the table, but you don't show the databases or tables in your MySQL instance. This testcase ensures there is nothing in the table beforehand.
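
If you want the same assurance on your own system, a quick check before the load will show whether the table really is empty (YourDB below is a placeholder for your actual database name):

mysql <<EOD
use YourDB;
show tables;
select count(*) from Food2;
EOD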

Double-check your CSV file for that record, e.g.:

grep '^4014,' file.csv
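
If that turns up more than one line, it is worth listing every PLU that repeats, not just 4014. Something along these lines (using the path from your first post) prints each duplicated first field with its count:

cut -d',' -f1 /var/www/PLU.csv | sort | uniq -cd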

Thanks to all.

I took a brute-force approach (as per Dr. Google).

Export the table to a CSV file. The statement below is a generic "MySQL export table to CSV" example (its table and columns are from that example, not my schema):

SELECT orderNumber, status, orderDate, requiredDate, comments 
FROM orders
INTO OUTFILE 'C:/tmp/cancelled_orders.csv'
FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"'
LINES TERMINATED BY '\r\n';

Modify the above code as needed (table, columns, and output path).

Import into LibreOffice Calc.

Modify as needed to remove the duplicates.

Truncate the table with the duplicates.

Re-import the CSV file.

Done
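
For what it's worth, the same cleanup should also be possible without the spreadsheet round trip: load the CSV into a keyless staging table, then copy the rows across with INSERT IGNORE so that only one row per PLU survives. A rough sketch, with the staging table name made up and YourDB again a placeholder:

mysql <<EOD
use YourDB;
create table Food2_stage like Food2;
alter table Food2_stage drop primary key;
load data infile '/var/www/PLU.csv' into table Food2_stage fields terminated by ',' enclosed by '"' lines terminated by '\n' ;
truncate table Food2;
-- rows with an already-inserted PLU are silently skipped
insert ignore into Food2 select * from Food2_stage;
drop table Food2_stage;
EOD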

BTW, is there a spell checker in any of the reply boxes?

---------- Post updated at 09:28 AM ---------- Previous update was at 12:17 AM ----------

To all -

My apologies.

I was not trying to second-guess anyone with my post; your answers are excellent! I was totally confused and feeling lost before I posted, and then it suddenly hit me: why not try the method I mentioned? My spreadsheet has approximately 1,500 lines; what I suggested would not work with much larger spreadsheets.

Again, sorry.

The code works well.