7. Restore MySQL backup

Examples of how to restore data from a backup. When a backup runs, it creates a restore.php file inside the backup folder. You run it from the command line like this:

php -f restore.php -- [<parameters>] [<backup folder>]

Parameters:

-u, --user - mysql user name under which the restore will run
-p, --password - mysql user password
-h, --host - mysql server host name
-P, --port - mysql server port number
-S, --socket - mysql server socket
-D, --database - target database name
--drop-db - drop the database if it exists
--no-data - skip data import
--force-local-server - force the data load to use the method optimal for a local server
-a, --actions - restore actions to execute (default is u,f,t,i,d,r,v,p,tr,g):

u - users
f - functions
t - structure of tables
i - indexes
d - table data
r - references
v - views
p - procedures
tr - triggers
g - permission grants

--create-index - create indexes before or after the data load (before|after)
-F, --filter-ext - external command which exits with code 1 if the action on the object should be processed
-C, --clone-to - folder to copy the backup data into; the filter will be applied, and if the value ends with zip the data will be compressed
-do, --decompress-only - decompress data files only
-df, --decompress-folder - data is decompressed into the data folder by default; use this option to change the destination
-da, --decompress-action - what to do with the decompressed files after the import completes (when data had to be decompressed for import):

delete - delete decompressed
keep - keep decompressed and compressed
replace - keep decompressed and delete compressed

-f, --force - will not prompt user to approve restore
-q, --quite - will not print messages
--log-process - print messages describing restore process (0=off, 1=on)
--log-sql-warn - print MySQL server warning messages (0=off, 1=on)
--log-sql-exec - print executed SQL statements (0=off, 1=all, 2=if SQL warning found)
-?, --help - display instructions on how to use cli.php

A particularly nice feature, mainly useful for developers, is the option to apply filters to the data you import. That way you don't have to import huge log tables that you don't really need on your local machine.

The way you would go about it is as follows. First, create a filter file; a PHP example is below (it can be any other executable type that responds correctly).

 1 <?php
 2 $dbName = $argv[1];
 3 $action = $argv[2];
 4 $objectName = $argv[3];
 5 
 6 switch ($action) {
 7    case "test":
 8        // control value
 9        exit(123);
10    case "data":
11        // restrict what data we want to import
12        if ($objectName[0]=="_") {
13            exit(0);
14        }
15 
16        if (substr($objectName, 0, 4)=="log_") {
17            exit(0);
18        }
19 
20        break;
21 }
22 
23 exit(1);

lines 2-4 set up variables to receive the input arguments

lines 7-9 for the filter file to be accepted, it must respond correctly to the test challenge (the exit code must be 123)

lines 12,16 add conditions that check whether the current object should be skipped

lines 13,17 exit with code 0 to skip the operation on the given object

line 23 exits with code 1 to continue operations on the current object
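Before pointing --filter-ext at your filter, it is worth confirming that it answers the test challenge. A minimal self-check sketch (the temp-file setup only keeps the snippet self-contained; the inlined filter mirrors the example above):

```php
<?php
// Write a minimal filter to a temp file, then verify it answers the
// "test" challenge with exit code 123, as described above.
$filter = <<<'PHP'
<?php
if (($argv[2] ?? '') === 'test') {
    exit(123); // control value the restore script expects
}
exit(1); // process everything else
PHP;

$path = tempnam(sys_get_temp_dir(), 'flt');
file_put_contents($path, $filter);

exec('php -f ' . escapeshellarg($path) . ' mydb test some_object', $out, $code);
echo $code === 123 ? "filter accepted\n" : "filter rejected\n";
unlink($path);
```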

To run it you can use this command:

php -f ~/backup/restore.php -- --drop-db -h localhost -u root -p -F "php -f ~/backup/filter.php " ~/backup/
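When restore.php evaluates --filter-ext, it appends the database name, the action, and the object name to the command and reads the exit code. A hedged sketch of that contract (the function name filterAllows and the exact details are assumptions based on the filter example, not the tool's actual internals):

```php
<?php
// Run an external filter command the way the restore script presumably
// does: append database, action, and object name, then read the exit
// code. Exit code 1 means "process this object"; 0 means "skip it".
function filterAllows(string $filterCmd, string $db, string $action, string $object): bool
{
    $cmd = $filterCmd . ' ' . escapeshellarg($db) . ' '
         . escapeshellarg($action) . ' ' . escapeshellarg($object);
    exec($cmd, $output, $exitCode);
    return $exitCode === 1;
}
```

With the example filter above, a "data" action on an object whose name starts with "_" or "log_" would make this return false, so that table's data would be skipped.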

Of course you can run the script on the backup to remove the actual data from the backup before you download it. You could go about it in this way:

php -f ~/backup/restore.php -- --clone-to "~/backup/filtered" -F "php -f ~/backup/filter.php " ~/backup/