18

I have a 250 MB backup SQL file but the limit on the new hosting is only 100 MB ...

Is there a program that lets you split an SQL file into multiple SQL files?

It seems like people are answering the wrong question ... so I will clarify more:

I ONLY have the 250 MB file, and on the new hosting I only have phpMyAdmin; the database there currently has no data. I need to upload the 250 MB file to the new host, but there is a 100 MB upload limit for SQL backup files. I simply need to take one file that is too large and split it into multiple files, each containing only complete, valid SQL statements (no statement may be split between two files).

11 Answers

12

The simplest way to split the backup file is to use a tool such as SQLDumpSplitter, which lets you split the dump file into multiple smaller SQL files.

Alternatively, use this terminal command:

split -l 600 ./path/to/source/file.sql ./path/to/dest/file-

Here, 600 is the number of lines you wish to have in each split file. The two arguments are the source file and the destination prefix for the output files, respectively.

NOTE: you must check the split files afterwards; a plain line-based split can cut a multi-line statement in two, so make sure no statement is split across files.
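If you want the rollover to happen only at statement boundaries, a small awk sketch such as the one below may help. It assumes each statement ends with a ; at the end of a line (true for typical mysqldump output), and the 50,000-line chunk size is just a placeholder to tune against the 100 MB limit:

awk '
    BEGIN { part = 1; lines = 0 }
    {
        # Append the current line to the active part file.
        file = sprintf("part-%03d.sql", part)
        print > file
        lines++
        # Start a new part only once the line ends a complete statement.
        if (lines >= 50000 && /;[ \t]*$/) {
            close(file)
            part++
            lines = 0
        }
    }
' ./path/to/source/file.sql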

giri
7

From How do I split the output from mysqldump into smaller files?

First dump the schema (it will surely fit within the limit, no?)

mysqldump -d --all-databases

and restore it.

Afterwards, dump only the data in separate INSERT statements, so you can split the files and restore them without having to concatenate them on the remote server:

mysqldump --all-databases --extended-insert=FALSE --no-create-info=TRUE
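Put together, the whole transfer could look like the sketch below; the file names and the 300,000-line chunk size are placeholders of mine, not part of the original answer:

# 1. Dump the schema only and restore it on the new host first.
mysqldump -d --all-databases > schema.sql

# 2. Dump only the data, one INSERT per row, so every line break
#    is a safe place to cut the file.
mysqldump --all-databases --extended-insert=FALSE --no-create-info=TRUE > data.sql

# 3. Split the data dump; each part can be imported on its own.
split -l 300000 data.sql data-part-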
harrymc
5

I wrote mysqldumpsplitter (a shell script), which splits out databases/tables as instructed, quickly and easily. See all the possible use cases of how to extract from a mysqldump.

sh mysqldumpsplitter.sh --source mysqldump-file.sql --extract DB --match_str database-name
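If I remember the script's options correctly, it can also extract a single table; treat the flag values below as an assumption to verify against the script's help output:

sh mysqldumpsplitter.sh --source mysqldump-file.sql --extract TABLE --match_str table-name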
jornane
2

SQLDumpSplitter3 is available for Linux, macOS and Windows. It includes a graphical user interface and a command-line interface.

On macOS, the command-line interface is located inside the application bundle:

/Applications/SQLDumpSplitter3.app/Contents/MacOS/SQLDumpSplitter3

Use SQLDumpSplitter3 to split foobar.sql into parts of 100 MB each:

SQLDumpSplitter3 split --file foobar.sql --size 100 --unit MB

This will create a folder foobar_split that contains the numbered parts.

In my experience the application works really well, even for rather large files (10 GB or more).
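For completeness, a minimal restore loop for the generated parts might look like this; the database name is a placeholder, and it assumes credentials are stored in ~/.my.cnf so there is no password prompt per file (the shell's lexical glob ordering matches the numbered file names):

for part in foobar_split/*.sql; do
    mysql dbname < "$part"
done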

2

This code will do exactly what you want (and it's open source):

https://web.archive.org/web/20160313044916/http://rodo.nl/index.php?page=mysql-splitter

It allows you to split any SQL file into several smaller files (you can define the maximum size). SQL syntax is kept correct, and it works with 'multiple insert' query syntax.

Hope this helps!

DavidPostill
Will0
0

I had this problem too and decided to write an extremely memory- and CPU-efficient piece of code that splits a single .sql file into several files (one per table).

I had to write it because no other solution I found performed well enough. On a real 16 GB dump I managed to split it in less than 2 minutes.

The code and instructions are available on the project page on GitHub.

0

Another useful, 'pure' sed command extracts all dump lines for a specific database into a separate file:

sed -n '/^-- Current Database: `db_name`/,/^-- Current Database: `/p' all_databases.sql > db.sql
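To pull out every database in one pass, a hedged extension of the same idea could loop over the names first; it assumes GNU grep (for -oP) and the standard -- Current Database: markers:

grep -oP '^-- Current Database: `\K[^`]+' all_databases.sql |
while read -r db; do
    sed -n "/^-- Current Database: \`$db\`/,/^-- Current Database: \`/p" \
        all_databases.sql > "$db.sql"
done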
DarkDiamond
0

1) Do you have the option to upload the file by another method, e.g. scp or FTP, and then restore it from the local file?

2) Will your ISP take the file on CD and load it for you?

3) Can you restore the file to a local server and then make a series of backup files from it, using specific criteria to keep the individual sizes down? (See the sketch after this list.)

4) Could you split the file manually and then tidy up the SQL statements at the ends of the files?
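For option 3, a per-table dump sketch might look like this; the database name "mydb" is a placeholder, and credentials are assumed to live in ~/.my.cnf so the commands do not prompt repeatedly:

# Restore the big file locally, then dump table-by-table so each
# output file stays small.
mysql mydb < backup.sql
mysql -N -e 'SHOW TABLES' mydb |
while read -r table; do
    mysqldump mydb "$table" > "mydb-$table.sql"
done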

Linker3000
0

There are a couple of options if you can run a bash or Perl script. Try this one from yoodey.com:

#!/usr/bin/perl -w
#
# splitmysqldump - split mysqldump file into per-database dump files.
use strict;
use warnings;

my $dbfile;
my $dbname = q{};
my $header = q{};

while (<>) {
    # Beginning of a new database section:
    # close the currently open file and start a new one.
    if (m/-- Current Database\: \`([-\w]+)\`/) {
        if (defined $dbfile && tell($dbfile) != -1) {
            close $dbfile or die "Could not close file!";
        }
        $dbname = $1;
        open $dbfile, '>>', "$1_dump.sql" or die "Could not create file!";
        print $dbfile $header;
        print "Writing file $1_dump.sql ...\n";
    }
    if (defined $dbfile && tell($dbfile) != -1) {
        print $dbfile $_;
    }
    # Catch the dump file header at the beginning so it can be
    # printed to each separate dump file.
    if (!$dbname) { $header .= $_; }
}
close $dbfile or die "Could not close file!" if defined $dbfile;
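Usage is straightforward, since the script reads from standard input or a file argument via <>; the script file name here is assumed:

perl splitmysqldump.pl all_databases.sql

This writes one <database>_dump.sql file per database into the current directory.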
Journeyman Geek
Jrgns
0

You can split a large file in Eclipse. I have successfully tried a 105 GB file on Windows:

Just add the MySQLDumpSplitter library to your project: http://dl.bintray.com/verace/MySQLDumpSplitter/jar/

Quick note on how to import:

- In Eclipse, right-click on your project --> Import
- Select "File System" and then "Next"
- Browse to the path of the jar file and press "Ok"
- Select (tick) the "MySQLDumpSplitter.jar" file and then "Finish"
- It will be added to your project and shown in the project folder in the Package Explorer in Eclipse
- Double-click the jar file in Eclipse (in the Package Explorer)
- The "MySQL Dump file splitter" window opens, where you can specify the location of your dump file and proceed with the split.
Alisa
-1

Instead of splitting the file, you could use a MySQL client on your local machine and connect it to the remote MySQL DB. I use HeidiSQL and have found it very good.

Of course, it may take a while to send 250 MB of SQL statements across the Internet.

You could also try BigDump, a staggered MySQL dump importer.

Hydaral