Monday, January 7, 2013

Chunking up the big .sql file

I had a scare today: I thought our Moodle DB had been hacked, at least part of it. I found myself back in a familiar mode, trying to figure out how best to restore my big honking .sql file that contains all my Moodle DB information. I use the mysqldump command to create backup files 3 times a week, and I store the backup files on the F: drive of my server.
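For reference, the backup side boils down to a single mysqldump command that can be scheduled. A sketch of what I mean, with the user name, password, database name, and paths as placeholders rather than my real values:

    mysqldump -u backupuser -pSecretPassword moodle > F:\backups\moodle_backup.sql

The -u and -p flags are the MySQL credentials, moodle is the database to dump, and the redirect sends the output to the .sql file on the F: drive.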

Since I am using WAMP, I have phpMyAdmin as a tool for most of my DB operations. After loading phpMyAdmin, I choose my test DB, which is an old replica of my production DB called student_copy. I make sure I DO NOT FORGET TO CHOOSE THIS DB, in case I blow something up.

Next, I browse to the F: drive on the server where the .sql file is located and click the Go button.

If only it were that simple. Despite my adjusting of the file upload size setting (now 1,024 MB), the .sql file is too large to import all at once. I did get this to work when the file size was less than 100 MB, but at 240 MB it is not loading. I may be able to get the file to upload in one shot using the command line, but that is tricky too. If I could get the GUI to work, that would be preferable.
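For anyone adjusting the same settings: in WAMP these limits live in php.ini, and the directives below are the ones that govern phpMyAdmin's import size (the values shown are illustrative; restart WAMP after editing):

    upload_max_filesize = 1024M
    post_max_size = 1024M
    memory_limit = 1024M
    max_execution_time = 600

Even with the sizes raised, a 240 MB import can still die on the execution time or memory limits, which may be why mine fails where a sub-100 MB file succeeded.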

After looking around a bit on Stack Overflow and the DBA Stack Exchange, there were a couple of mentions of chunking up the large .sql file. I did have success doing this by opening the .sql file in Notepad++ and then simply copying and pasting into the SQL pane of phpMyAdmin.

I can copy/paste directly into the SQL box in phpMyAdmin from the .sql file...

This is not the most efficient method, but it does allow me to pick and choose SQL commands from the file and load them into phpMyAdmin for execution.

The steps then to chunk up the file for execution:

1 - open phpMyAdmin
2 - open the .sql file (using Notepad++)
3 - copy the commands for the tables to update from the .sql file (each table's block in the dump looks like the sample below)
4 - paste the commands into the SQL pane in phpMyAdmin (don't try too many at once... pay attention to the amount of data in the table...)
5 - repeat - lather - rinse...
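For anyone who has not stared inside a mysqldump file: the commands come grouped per table, which is what makes step 3 workable. A typical block looks roughly like this (the table and columns here are invented for illustration; Moodle tables use the mdl_ prefix):

    -- Table structure for table `mdl_example`
    DROP TABLE IF EXISTS `mdl_example`;
    CREATE TABLE `mdl_example` (
      `id` bigint(10) NOT NULL AUTO_INCREMENT,
      `name` varchar(255) NOT NULL DEFAULT '',
      PRIMARY KEY (`id`)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8;

    -- Dumping data for table `mdl_example`
    LOCK TABLES `mdl_example` WRITE;
    INSERT INTO `mdl_example` VALUES (1,'first'),(2,'second');
    UNLOCK TABLES;

Copying from one "-- Table structure" comment to the next gives a complete, self-contained chunk for a single table.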

In the meantime, I need to figure out how to do this from the command line, like I did with the mysqldump command that allowed me to automate the creation of the backup files.

This post looks like it contains the right command; I just need to be sure I am getting the server, DB name, etc. correct. There is NO CHANCE of me trying this at the command line if there is even a remote chance I may blow up the production DB, which is on the same machine as the copy DB.

Using a Dump File Splitter - like a knuckleball

This guy writes up a scenario much like mine... I have to test his solution! Can I really run it from the command line that easily? “mysql -uusername -ppassword < dumpfile.sql” - I need to know exactly what is being replaced before I run this... how do I specify the DB to run this on? I guess I could add a USE dbname statement to the top of the script? And this post is informative too.
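Having poked at this a bit, I believe the answer to my own question is that the mysql client takes the database name as its last argument, so the target DB can be made explicit right on the command line. A sketch, with placeholder credentials and my copy DB as the target:

    mysql -u username -p student_copy < F:\backups\dumpfile.sql

Leaving the password off of -p makes the client prompt for it instead of leaving it in the command history. The alternative is exactly the USE idea: putting USE student_copy; as the first line of the .sql file does the same job. Either way, naming student_copy explicitly keeps the production DB out of the blast radius.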
