I tried the approach in this example first, but had not yet succeeded when I found the next example, which uses PHP to back up the db.
Create the backup file
Here is the script:
// Build the SQL dump for every table in $tables
$return = '';
foreach ($tables as $table)
{
    $result = mysql_query('SELECT * FROM '.$table);
    $num_fields = mysql_num_fields($result);

    // Drop and re-create the table
    $return .= 'DROP TABLE IF EXISTS '.$table.';';
    $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table));
    $return .= "\n\n".$row2[1].";\n\n";

    // Write one INSERT statement per row
    while ($row = mysql_fetch_row($result))
    {
        $return .= 'INSERT INTO '.$table.' VALUES(';
        for ($j = 0; $j < $num_fields; $j++)
        {
            $row[$j] = addslashes($row[$j]);
            $row[$j] = str_replace("\n", "\\n", $row[$j]);
            if (isset($row[$j])) { $return .= '"'.$row[$j].'"'; } else { $return .= '""'; }
            if ($j < ($num_fields - 1)) { $return .= ','; }
        }
        $return .= ");\n";
    }
    $return .= "\n\n\n";
}

// Save the dump to a time-stamped .sql file
$handle = fopen('c:\automatedDBBackups\db-backup-'.date("Y-m-d-H-i-s").'-'.(md5(implode(',',$tables))).'.sql', 'w+');
fwrite($handle, $return);
fclose($handle);
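Note that the snippet assumes a database connection is already open and that $tables already holds the table names before the loop starts. The script I used had that part elsewhere; something along these lines would fill the gap (the host, user, password and database name here are placeholders, not my actual values):

// Assumed setup: open the connection and collect every table name
$link = mysql_connect('localhost', 'db_user', 'db_password');
mysql_select_db('moodle', $link);

$tables = array();
$result = mysql_query('SHOW TABLES');
while ($row = mysql_fetch_row($result))
{
    $tables[] = $row[0];
}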
I copied the script into the PHP file I was using, tweaked it to store the .sql file in a different location with a different date/time stamp, and it worked the first time I tried it. Not bad, script author.
I opened the resulting .sql file in Notepad++, and it looked good: exactly what I expected.
Restore the backup file (.sql file)
I went into phpMyAdmin, selected my db, clicked on Import, browsed to my new .sql file and clicked the Go button.
The .sql backup file was too large to upload to the server, so I had to edit the php.ini file again and update two variables, setting each to 25 MB. The phpMyAdmin documentation was very helpful here; this excerpt is from phpmyadmin/Documentation.html#faq1_16:
"1.16 I cannot upload big dump files (memory, HTTP or timeout problems).
Starting with version 2.7.0, the import engine has been re-written and these problems should not occur. If possible, upgrade your phpMyAdmin to the latest version to take advantage of the new import features.
The first things to check (or ask your host provider to check) are the values of upload_max_filesize, memory_limit and post_max_size in the php.ini configuration file. All of these three settings limit the maximum size of data that can be submitted and handled by PHP. One user also said that post_max_size and memory_limit need to be larger than upload_max_filesize."
I updated upload_max_filesize and post_max_size, but not memory_limit.
After updating php.ini and restarting the Apache service, I tried to import the .sql file again and had success. The import took about 60 seconds, and it's a small file, about 9 MB (small as DB sizes go).
What I will likely do in the future is restore only part of the backup file to the DB. If a table or multiple tables become corrupted or hacked, I can pull what I need from the .sql file, since every Moodle table is created and its data inserted in the dump.
In the event of a larger-scale failure of the DB, I could restore the entire file, but I will probably have to update those two php.ini variables again, as the size of the DB will grow well beyond 25 MB.
After importing the file, I checked the site and a couple of the courses to make sure the data came in okay, and everything looked in order.
Trying to call my new backup.php script from a cron service.
Now I want to automate this by calling the PHP file from a cron script. I am going to try that and see if I can get the script called at midnight. Actually, I am trying a Windows scheduled task first: I followed this example and created a task that launches Firefox, which loads my PHP page (a rough sketch of an equivalent task is below). I will check in the morning to see if I have a .sql file created at 1 AM.
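For anyone wanting to reproduce the scheduled task, the command it runs is roughly equivalent to the sketch below; the Firefox path, task name, and backup.php URL are stand-ins for my own setup rather than anything from the example I followed:

rem Run the backup page daily at 1 AM by having Firefox load it (paths and URL assumed)
schtasks /Create /SC DAILY /ST 01:00 /TN "MoodleDbBackup" /TR "\"C:\Program Files\Mozilla Firefox\firefox.exe\" http://localhost/moodle/backup.php"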