In this article, we will walk through the process of migrating a large WordPress site, focusing on both a large database (over 40 GB) and a large uploads directory (over 100 GB). We will assume that you have shell access to both the old and new servers and that tools such as mydumper and rsync will be used for the migration. The steps outlined here will help ensure that the migration is done efficiently, especially when dealing with large amounts of data.
Prerequisites:
- Shell access to both the old and new servers.
- mydumper and rsync installed on both servers.
- Basic knowledge of PHP, SQL, and Bash.
Key Details:
- Old domain: www.old-site.com
- New domain: www.new-site.com
- Old database name: olddb
- New database name: newdb
Step 1: Verify Tools Installation
First, check whether mydumper and rsync are installed on both servers. If not, you can install them using the following commands:
Check for MyDumper and Rsync:
# Check if mydumper is installed
mydumper --version
# Check if rsync is installed
rsync --version
Install MyDumper:
# On Ubuntu-based systems
sudo apt-get install mydumper
# On CentOS-based systems (may require the EPEL repository)
sudo yum install mydumper
Install Rsync:
# On Ubuntu-based systems
sudo apt-get install rsync
# On CentOS-based systems
sudo yum install rsync
Step 2: Export the Database in Smaller Compressed Chunks
Use mydumper to export the large database from www.old-site.com in smaller, compressed chunks. mydumper is particularly useful because it allows parallel exports and compression.
Export Command:
# On the old site
# Add --user/--password (or --defaults-file) if the account needs explicit credentials
mydumper --database=olddb --outputdir=/path/to/export --compress --threads=4 --chunk-filesize=100
- --threads=4: Number of threads for parallelism.
- --chunk-filesize=100: Each chunk file will be around 100 MB.
- --compress: Compresses each chunk during export.
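Before transferring anything, it's worth confirming that the export actually completed. Recent mydumper versions write a `metadata` file into the output directory once the dump finishes cleanly, so checking for it (plus counting the chunk files) is a cheap sanity test. The sketch below mocks the export directory with placeholder files so it's self-contained; on the real server, point EXPORT_DIR at your actual `--outputdir` instead.

```shell
# Mock an export directory for illustration; on the real server,
# set EXPORT_DIR=/path/to/export (the mydumper --outputdir).
EXPORT_DIR=$(mktemp -d)
touch "$EXPORT_DIR/metadata"                        # written by mydumper on clean completion
touch "$EXPORT_DIR/olddb.wp_posts.00000.sql.gz"     # example chunk files
touch "$EXPORT_DIR/olddb.wp_posts.00001.sql.gz"

# The metadata file only exists once the dump finished cleanly
if [ -f "$EXPORT_DIR/metadata" ]; then
    STATUS="export finished"
else
    STATUS="export incomplete"
fi
echo "$STATUS"

# Count the compressed chunk files that will be transferred
CHUNKS=$(ls "$EXPORT_DIR"/*.sql.gz | wc -l | tr -d ' ')
echo "chunk files: $CHUNKS"
```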
Step 3: Transfer the Exported Database to the New Server
After the export is complete, use rsync to transfer the database chunks from www.old-site.com to www.new-site.com.
Rsync Command:
# On the old site
rsync -avz /path/to/export/ user@new-site.com:/path/to/new/location
- -a: Archive mode (preserves permissions, timestamps, and symlinks).
- -v: Verbose output.
- -z: Compress data during transfer.
Step 4: Import the Database into the New Server
On www.new-site.com, use the myloader command to import the database chunks into the new database (newdb).
Import Command:
# On the new site
myloader --directory=/path/to/new/location --database=newdb --threads=4 --overwrite-tables
- --overwrite-tables: Overwrites any existing tables.
- Compressed chunk files produced by mydumper are detected and decompressed automatically, so no extra flag is needed.
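After myloader finishes, compare the table inventories of olddb and newdb before pointing WordPress at the new database. On the real servers this is a pair of `mysql -N -e 'SHOW TABLES'` calls (one per host) whose outputs you diff; the sketch below mocks those two outputs with text files so the comparison logic is self-contained and runnable.

```shell
# On the real servers these files would come from:
#   mysql -N -e 'SHOW TABLES' olddb > old_tables.txt   (run on the old host)
#   mysql -N -e 'SHOW TABLES' newdb > new_tables.txt   (run on the new host)
WORK=$(mktemp -d)
printf 'wp_options\nwp_postmeta\nwp_posts\n' > "$WORK/old_tables.txt"
printf 'wp_options\nwp_postmeta\nwp_posts\n' > "$WORK/new_tables.txt"

# Any difference in the sorted table lists means the import is incomplete
if diff -q "$WORK/old_tables.txt" "$WORK/new_tables.txt" >/dev/null; then
    TABLES_OK="table lists match"
else
    TABLES_OK="table lists differ"
fi
echo "$TABLES_OK"
```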
Step 5: Migrate the Uploads Directory in Chunks
The WordPress upload folder can be very large, so it’s efficient to split it into smaller chunks before transferring. We will use PHP and rsync to automate this process.
PHP Script to Split the Upload Directory:
<?php
/**
* File Chunking Script
*
* This script organizes files into chunks of a specified size (500 MB by default).
* It sorts files by size and splits large files if necessary.
*
* Example scenario:
* Given three files:
* - File A: 1 KB
* - File B: 1024 MB
* - File C: 400 MB
*
* The script will process these files as follows:
*
* 1. Sort files by size: B (1024 MB), C (400 MB), A (1 KB)
*
* 2. Process File B (1024 MB):
* - Split into three parts:
* Chunk 0: FileB.part0 (500 MB)
* Chunk 1: FileB.part1 (500 MB)
* Chunk 2: FileB.part2 (24 MB)
*
* 3. Process File C (400 MB):
* - Fits in Chunk 2 with FileB.part2
* Chunk 2: FileB.part2 (24 MB) + FileC (400 MB)
*
* 4. Process File A (1 KB):
* - Added to Chunk 2
* Chunk 2: FileB.part2 (24 MB) + FileC (400 MB) + FileA (1 KB)
*
* Final Result:
* - Chunk 0 (500 MB): FileB.part0
* - Chunk 1 (500 MB): FileB.part1
* - Chunk 2 (424 MB): FileB.part2 + FileC + FileA
*
* This approach ensures efficient use of space while keeping chunks under 500 MB.
*/
$upload_dir = '/path/to/old-site/wp-content/uploads/';
$target_dir = '/path/to/new-site/uploads/';
$chunk_size = 500 * 1024 * 1024; // 500MB in bytes
// Recursively collect all files with their sizes.
// WordPress stores uploads in year/month subdirectories, so a
// non-recursive scan would miss most of them.
function getFiles($dir) {
    $files = [];
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $fileInfo) {
        if ($fileInfo->isFile()) {
            $files[$fileInfo->getPathname()] = $fileInfo->getSize();
        }
    }
    return $files;
}
// Get all files and sort them by size (largest first)
$files = getFiles($upload_dir);
arsort($files);
$chunk_index = 0;
$current_chunk_size = 0;
$chunk_dir = $target_dir . 'chunk_' . $chunk_index;
if (!is_dir($chunk_dir)) {
mkdir($chunk_dir, 0755, true);
}
foreach ($files as $file_path => $file_size) {
    // Note: chunking flattens paths to basenames, so identically named
    // files from different subdirectories would overwrite each other.
    $file_name = basename($file_path);
if ($file_size > $chunk_size) {
// Split large files
$fp = fopen($file_path, 'rb');
$part = 0;
while (!feof($fp)) {
$chunk_remaining = $chunk_size - $current_chunk_size;
// Read just enough to fill the current chunk; up to 500 MB is held
// in memory at once, so make sure PHP's memory_limit allows it.
$piece = fread($fp, $chunk_remaining);
$piece_name = $file_name . '.part' . $part;
file_put_contents($chunk_dir . '/' . $piece_name, $piece);
$current_chunk_size += strlen($piece);
$part++;
if ($current_chunk_size >= $chunk_size) {
$chunk_index++;
$current_chunk_size = 0;
$chunk_dir = $target_dir . 'chunk_' . $chunk_index;
if (!is_dir($chunk_dir)) {
mkdir($chunk_dir, 0755, true);
}
}
}
fclose($fp);
} else {
// Check if file fits in current chunk
if (($current_chunk_size + $file_size) > $chunk_size) {
$chunk_index++;
$current_chunk_size = 0;
$chunk_dir = $target_dir . 'chunk_' . $chunk_index;
if (!is_dir($chunk_dir)) {
mkdir($chunk_dir, 0755, true);
}
}
copy($file_path, $chunk_dir . '/' . $file_name);
$current_chunk_size += $file_size;
}
}
echo "Chunking complete. Total chunks created: " . ($chunk_index + 1);
Run the script to copy the files in smaller chunks.
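Once the script has run, a quick `du` pass over the chunk directories confirms that none of them exceeded the 500 MB budget. The sketch below builds two small mock chunk directories so the check is self-contained; on the server, point CHUNK_ROOT at the script's real $target_dir instead.

```shell
# Mock chunk layout; on the server, CHUNK_ROOT would be the PHP script's $target_dir
CHUNK_ROOT=$(mktemp -d)
mkdir "$CHUNK_ROOT/chunk_0" "$CHUNK_ROOT/chunk_1"
head -c 1024 /dev/zero > "$CHUNK_ROOT/chunk_0/a.jpg"   # small stand-in files
head -c 2048 /dev/zero > "$CHUNK_ROOT/chunk_1/b.jpg"

LIMIT_KB=$((500 * 1024))   # 500 MB budget, in KB to match du -k
OVERSIZED=0
for dir in "$CHUNK_ROOT"/chunk_*; do
    kb=$(du -sk "$dir" | cut -f1)
    if [ "$kb" -gt "$LIMIT_KB" ]; then
        OVERSIZED=$((OVERSIZED + 1))
    fi
done
echo "oversized chunks: $OVERSIZED"
```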
Transfer the Chunks using Rsync:
# rsync can resume interrupted transfers on its own (--partial --append),
# so chunking with PHP is optional: you can skip the script and run a
# single rsync with the flags -avP --partial --append instead.
# Dry run first to surface any errors before moving real data
rsync -avz --dry-run /path/to/chunk/ user@new-site.com:/path/to/new-site/chunks/
# Run the transfer (if you chunked the files with the PHP script)
rsync -avz /path/to/chunk/ user@new-site.com:/path/to/new-site/chunks/
# Run the transfer (if you skipped the PHP chunking)
rsync -avP --partial --append /path/to/chunk/ user@new-site.com:/path/to/new-site/chunks/
Reconstruct the chunked files (only needed if you chunked with the PHP script):
<?php
/**
* File Reconstruction Script
*
* This script reconstructs files that were chunked by the previous chunking script.
* It handles both split files (with .part* extensions) and regular files.
*
* How it works:
* 1. Scans the chunk directories
* 2. Identifies split files and regular files
* 3. Reconstructs split files by combining their parts
* 4. Copies regular files to the output directory
*
* Usage:
* Set the $chunked_dir to the directory containing the chunks
* Set the $output_dir to where you want the reconstructed files to be placed
*/
$chunked_dir = '/path/to/chunked/files/'; // Directory containing the chunks
$output_dir = '/path/to/reconstructed/files/'; // Directory for reconstructed files
// Ensure output directory exists
if (!is_dir($output_dir)) {
mkdir($output_dir, 0755, true);
}
// Function to get all files in a directory and its subdirectories
function getFiles($dir) {
$files = [];
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));
foreach ($it as $file) {
if ($file->isFile()) {
$files[] = $file->getPathname();
}
}
return $files;
}
// Get all files in chunked directory
$chunked_files = getFiles($chunked_dir);
// Group file parts together
$file_parts = [];
$regular_files = [];
foreach ($chunked_files as $file) {
$filename = basename($file);
if (preg_match('/^(.+)\.part(\d+)$/', $filename, $matches)) {
$original_name = $matches[1];
$part_number = intval($matches[2]);
$file_parts[$original_name][$part_number] = $file;
} else {
$regular_files[] = $file;
}
}
// Reconstruct split files
foreach ($file_parts as $original_name => $parts) {
ksort($parts); // Ensure parts are in correct order
$output_file = $output_dir . $original_name;
$output_handle = fopen($output_file, 'wb');
foreach ($parts as $part) {
$input_handle = fopen($part, 'rb');
while (!feof($input_handle)) {
fwrite($output_handle, fread($input_handle, 8192)); // Copy in 8KB chunks
}
fclose($input_handle);
}
fclose($output_handle);
echo "Reconstructed: $original_name\n";
}
// Copy regular files
foreach ($regular_files as $file) {
$destination = $output_dir . basename($file);
copy($file, $destination);
echo "Copied: " . basename($file) . "\n";
}
echo "Reconstruction complete.\n";
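The reconstruction logic boils down to concatenating .part files in numeric order, the same guarantee coreutils give you with split and cat. The local sketch below round-trips a file through the same split-then-concatenate cycle and verifies it byte-for-byte; running a similar checksum comparison (e.g. md5sum) on a few large reconstructed uploads against the originals is a good final check.

```shell
WORK=$(mktemp -d); cd "$WORK"

# Create a 1 MB test file and split it into fixed-size parts,
# mirroring what the PHP chunker does with .part0, .part1, ...
head -c 1048576 /dev/urandom > original.bin
split -b 400k -d original.bin original.bin.part    # part00, part01, part02

# Reconstruct: concatenate the parts in order (the glob sorts the
# fixed-width numeric suffixes correctly)
cat original.bin.part* > rebuilt.bin

# Byte-for-byte comparison of original and reconstruction
if cmp -s original.bin rebuilt.bin; then
    VERIFY="reconstruction verified"
else
    VERIFY="reconstruction failed"
fi
echo "$VERIFY"
```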
Step 6: Update URLs and GUIDs in the WordPress Database
After migrating the files and database, update the old domain references in the new database. This means replacing all instances of the old domain (www.old-site.com) with the new domain (www.new-site.com) in the database.
SQL Procedure to Change Domain:
Use the following stored procedure to update the domain across the posts, options, and postmeta tables.
DROP PROCEDURE IF EXISTS wp_change_domain;
DELIMITER $$
CREATE PROCEDURE wp_change_domain(
IN new_domain VARCHAR(255),
IN old_domain VARCHAR(255),
IN prefix VARCHAR(255)
)
BEGIN
DECLARE wp_posts VARCHAR(255);
DECLARE wp_options VARCHAR(255);
DECLARE wp_postmeta VARCHAR(255);
SET wp_posts = CONCAT(prefix, 'posts');
SET wp_options = CONCAT(prefix, 'options');
SET wp_postmeta = CONCAT(prefix, 'postmeta');
SET @stmt1 = CONCAT('UPDATE ', wp_options, ' SET option_value = REPLACE(option_value, \'', old_domain, '\', \'', new_domain, '\') WHERE option_name IN (\'home\', \'siteurl\');');
SET @stmt2 = CONCAT('UPDATE ', wp_posts, ' SET guid = REPLACE(guid, \'', old_domain, '\', \'', new_domain, '\');');
SET @stmt3 = CONCAT('UPDATE ', wp_posts, ' SET post_content = REPLACE(post_content, \'', old_domain, '\', \'', new_domain, '\');');
SET @stmt4 = CONCAT('UPDATE ', wp_postmeta, ' SET meta_value = REPLACE(meta_value, \'', old_domain, '\', \'', new_domain, '\');');
PREPARE stmt1 FROM @stmt1;
PREPARE stmt2 FROM @stmt2;
PREPARE stmt3 FROM @stmt3;
PREPARE stmt4 FROM @stmt4;
EXECUTE stmt1;
EXECUTE stmt2;
EXECUTE stmt3;
EXECUTE stmt4;
DEALLOCATE PREPARE stmt1;
DEALLOCATE PREPARE stmt2;
DEALLOCATE PREPARE stmt3;
DEALLOCATE PREPARE stmt4;
END $$
DELIMITER ;
CALL wp_change_domain('www.new-site.com', 'www.old-site.com', 'wp_');
This procedure updates the home, siteurl, guid, post_content, and meta_value fields. One caveat: a plain REPLACE() corrupts PHP-serialized values (common in options and postmeta) whenever the old and new strings differ in length. Here both domains happen to be exactly the same length, so serialized data survives, but for domains of different lengths prefer a serialization-aware tool such as WP-CLI's wp search-replace.
Conclusion
Migrating a large WordPress site requires careful planning, especially when dealing with large databases and files. The combination of mydumper, rsync, PHP, and SQL procedures ensures that both the database and the uploads folder can be transferred efficiently without overwhelming the system or network. Follow these steps closely to achieve a smooth and successful migration.