MySQL question: Best way to update a column in a table containing 8 million rows

Hi all,

I was wondering if anyone knew a good/safe way to update a single column in a table that could contain up to 8 million rows...

A simple command like:

UPDATE some_table SET blah = foo WHERE bar = XXX;

I will be running this both on tables that are actively being written to and on tables that are already fully populated.

---------- Post updated at 03:38 PM ---------- Previous update was at 06:54 AM ----------

Does no one have a way?

The only approach I can think of is to limit the UPDATE and run it in batches; a sketch of that idea follows.
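
A minimal sketch of that batching idea, reusing the placeholder names some_table, blah, foo, bar and XXX from above, and assuming bar is indexed and blah is NOT NULL (the blah <> foo predicate would skip NULL rows otherwise). You repeat the statement until it reports 0 rows affected:

-- Touch at most 10,000 rows per pass; the blah <> foo predicate
-- stops later passes from re-matching rows that are already updated.
UPDATE some_table
SET blah = foo
WHERE bar = XXX
  AND blah <> foo
LIMIT 10000;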

Hi.

Sorry, I'm not that familiar with MySQL, but since you haven't had an answer yet, I'll offer what I can!

This is based on Oracle, but the same ideas should carry over.

Firstly, it's safe to update a table with any number of rows. The worst that can happen is that the update fails and the uncommitted changes are rolled back.

You could:

  • split the update into smaller batches
  • add COMMIT statements every few hundred / thousand statements
  • instruct your SQL client to commit every few hundred / thousand statements
  • add more redo / undo logs, make your existing ones larger, or both
  • switch off logging on the table (ALTER TABLE .. NOLOGGING, if MySQL supports an equivalent) to reduce redo / undo (in most cases)
  • disable referential / unique constraint indexes (more for performance); a MySQL-flavoured sketch of this follows the list
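
For that last point, the nearest MySQL equivalents I know of are session-level flags rather than Oracle-style table options. A minimal sketch, assuming an InnoDB table and that you can tolerate constraint checks being skipped for the duration of the bulk update:

-- Relax checking for this session only.
SET unique_checks = 0;
SET foreign_key_checks = 0;

-- ... run the batched UPDATE passes here ...

-- Restore normal checking afterwards.
SET unique_checks = 1;
SET foreign_key_checks = 1;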

Many thanks, Scottn!

I think I am going to go down the road of creating a script to run in the background (as a cron task) and commit every X thousand records; a sketch of that loop follows. I just didn't want to run the update all in one go, as that would hurt performance and generate a large amount of log.
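
For anyone finding this later, here is a minimal sketch of that loop as a MySQL stored procedure instead of a cron script; batch_update_blah is a made-up name and some_table, blah, foo, bar and XXX are the placeholders from above. With autocommit left on, each batch commits as soon as its UPDATE finishes:

DELIMITER //
CREATE PROCEDURE batch_update_blah()
BEGIN
  DECLARE rows_changed INT DEFAULT 1;
  REPEAT
    -- Touch at most 10,000 rows per pass.
    UPDATE some_table
    SET blah = foo
    WHERE bar = XXX
      AND blah <> foo
    LIMIT 10000;
    -- ROW_COUNT() reports how many rows the last UPDATE changed.
    SET rows_changed = ROW_COUNT();
  UNTIL rows_changed = 0 END REPEAT;
END //
DELIMITER ;

CALL batch_update_blah();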

Read more: Mysql and a billion rows using innodb - dslreports.com