MySQL Import Basics
Introduction
Importing data is a fundamental skill for any database administrator or developer working with MySQL. Whether you're migrating data between systems, loading test data, or restoring a database backup, understanding how to efficiently import data into MySQL is essential.
In this guide, we'll explore the different methods and tools available for importing data into MySQL databases. We'll cover importing from various file formats, focusing primarily on SQL and CSV files, which are the most commonly used formats for data exchange.
Why Import Data?
Before diving into the specifics, let's understand some common scenarios where you might need to import data:
- Migrating data from one system to another
- Setting up test environments with realistic data
- Restoring database backups
- Loading initial data for a new application
- Periodically updating databases from external sources
Prerequisites
Before we begin, make sure you have:
- MySQL installed and running on your system
- Access to a MySQL user account with appropriate privileges
- Basic understanding of SQL and database concepts
- MySQL command-line client or MySQL Workbench installed
Importing SQL Files
SQL files contain SQL statements like CREATE TABLE, INSERT, and other database commands. They're commonly used for database backups and schema migrations.
Using the MySQL Command-Line Client
The most straightforward way to import SQL files is using the MySQL command-line client:
mysql -u username -p database_name < filename.sql
Let's break down this command:
- mysql is the command-line client
- -u username specifies your MySQL username
- -p tells MySQL to prompt for a password
- database_name is the database where you want to import the data
- < filename.sql redirects the contents of the SQL file into MySQL
Example: Importing a Database Backup
Suppose you have a backup file named company_backup.sql and want to restore it to a database named company:
mysql -u root -p company < company_backup.sql
You'll be prompted for your password:
Enter password:
If no errors are displayed, the import was successful.
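For an extra sanity check, you can connect to the database and list its tables; the exact names will depend on what company_backup.sql contains:
mysql> USE company;
mysql> SHOW TABLES;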
Using the source Command
If you're already in the MySQL client, you can use the source command:
- First, connect to your database:
mysql -u username -p database_name
- Then use the source command:
source /path/to/filename.sql;
Example:
mysql> USE company;
Database changed
mysql> source /home/user/company_backup.sql;
Query OK, 0 rows affected (0.05 sec)
...
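The client also accepts \. as a shorthand for source:
mysql> \. /home/user/company_backup.sql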
Importing CSV Files
CSV (Comma-Separated Values) files are popular for data exchange because they're simple and can be created/edited with spreadsheet applications like Excel.
Using the LOAD DATA INFILE Statement
MySQL's LOAD DATA INFILE is a high-performance statement for importing CSV and other text-based data:
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Parameters explained:
- FIELDS TERMINATED BY ',' specifies the character separating fields (typically a comma)
- ENCLOSED BY '"' specifies the character enclosing values (typically double quotes)
- LINES TERMINATED BY '\n' specifies the line-ending character
- IGNORE 1 ROWS skips the first row (the header)
Example: Importing Employee Data
Let's say we have a CSV file named employees.csv with the following content:
id,name,department,salary
1,John Smith,Engineering,75000
2,Mary Johnson,Marketing,65000
3,James Brown,Finance,80000
First, we need a table to hold this data:
CREATE TABLE employees (
id INT PRIMARY KEY,
name VARCHAR(100),
department VARCHAR(50),
salary INT
);
Then we can import the data:
LOAD DATA INFILE '/path/to/employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Output:
Query OK, 3 rows affected (0.05 sec)
Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
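As a quick verification, querying the table should return the three rows from employees.csv:
SELECT * FROM employees;
-- or simply confirm the row count, which should be 3 for the sample file
SELECT COUNT(*) FROM employees;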
Handling Permission Issues
You might encounter a common error: "The MySQL server is running with the --secure-file-priv option..."
This error occurs because MySQL restricts the directories from which files can be loaded. To resolve this:
- Check your secure-file-priv setting:
SHOW VARIABLES LIKE 'secure_file_priv';
- Either move your CSV file to the allowed directory, or use the LOCAL keyword for client-side files:
LOAD DATA LOCAL INFILE '/path/to/employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Note: The LOCAL keyword allows loading from the client machine but must be enabled in both server and client configurations.
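A minimal sketch of enabling LOCAL, assuming you have the privileges to change global variables, looks like this on the server side:
-- Allow LOAD DATA LOCAL on the server (needs SUPER or SYSTEM_VARIABLES_ADMIN)
SET GLOBAL local_infile = 1;
-- Confirm the setting
SHOW VARIABLES LIKE 'local_infile';
On the client side, start the command-line client with the --local-infile=1 option (for example, mysql --local-infile=1 -u username -p database_name) so the client permits LOCAL as well.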
Using MySQL Workbench for Imports
MySQL Workbench provides a graphical interface for importing data:
- Open MySQL Workbench and connect to your database
- Select your database in the Navigator panel
- Go to Server > Data Import
- Choose either "Import from Self-Contained File" for SQL files or "Import from Dump Project Folder"
- Select your file/folder and target schema
- Click "Start Import"
For CSV files specifically:
- Right-click on your table name in Workbench
- Select "Table Data Import Wizard"
- Follow the prompts to select your CSV file and map columns
Handling Different File Encodings
If your import file uses a character encoding different from your database's default encoding, specify it in your import command:
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE table_name
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
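If you're not sure which character set the target table or database uses, you can check before importing (table_name is a placeholder here):
SHOW CREATE TABLE table_name;
-- or inspect the database default
SELECT @@character_set_database;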
Mapping CSV Columns to Table Columns
Sometimes your CSV file columns might not match the exact order of your table columns. You can specify the mapping:
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(name, @dummy, department, salary, id);
This loads the first CSV column into the name field, ignores the second column (using @dummy), and then loads the rest into the specified columns in order.
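User variables can also transform values on the way in rather than just discard columns. The following is a hypothetical variation of the earlier employees import that trims stray whitespace from the name and uppercases the department, assuming the file matches the employees.csv layout shown above:
LOAD DATA INFILE '/path/to/employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, @name, @department, salary)
SET name = TRIM(@name),
    department = UPPER(@department);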
Importing Large Data Sets
When importing very large files (hundreds of megabytes or more), consider these performance optimizations:
- Temporarily disable constraints and indexes before import:
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE your_table DISABLE KEYS;
-- Import data here
SET FOREIGN_KEY_CHECKS=1;
ALTER TABLE your_table ENABLE KEYS;
- Consider increasing MySQL's buffer sizes temporarily:
SET GLOBAL innodb_buffer_pool_size = 268435456; -- 256MB
- Use the LOCAL keyword cautiously, as it can be slower for large files (a combined sketch of these settings follows this list)
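Putting these together, a large import session might look like the sketch below. The file and table names are placeholders, and keep in mind that ALTER TABLE ... DISABLE KEYS only speeds up nonunique indexes on MyISAM tables; remember to restore each setting afterwards:
-- Relax checks for the duration of the bulk load
SET AUTOCOMMIT = 0;
SET UNIQUE_CHECKS = 0;
SET FOREIGN_KEY_CHECKS = 0;
LOAD DATA INFILE '/path/to/big_file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
COMMIT;
-- Restore normal behaviour
SET FOREIGN_KEY_CHECKS = 1;
SET UNIQUE_CHECKS = 1;
SET AUTOCOMMIT = 1;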
Using the mysqlimport Utility
MySQL provides a dedicated command-line utility called mysqlimport specifically for importing data:
mysqlimport --local --fields-terminated-by=',' --fields-enclosed-by='"' --lines-terminated-by='\n' --ignore-lines=1 --user=username --password database_name /path/to/file.csv
This utility is essentially a wrapper around the LOAD DATA INFILE statement but can be more convenient in shell scripts. Note that mysqlimport derives the target table name from the file name, so /path/to/file.csv would be loaded into a table named file.
Troubleshooting Import Issues
Here are some common import problems and their solutions:
Data Truncation Errors
When column values exceed the defined field length:
ERROR 1265 (01000): Data truncated for column 'column_name' at row X
Solution: Either increase the column size in your table definition or fix the data in your import file.
Duplicate Key Errors
When trying to insert a row with a primary key that already exists:
ERROR 1062 (23000): Duplicate entry 'X' for key 'PRIMARY'
Solution: Use the IGNORE keyword to skip duplicate rows, or REPLACE to overwrite them:
LOAD DATA INFILE '/path/to/file.csv'
IGNORE
INTO TABLE table_name
...
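For example, re-running the earlier employees import with IGNORE skips any row whose id already exists in the table; skipped duplicates are reported as warnings rather than errors:
LOAD DATA INFILE '/path/to/employees.csv'
IGNORE
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;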
Format Errors
If your file format doesn't match the import command parameters:
ERROR 1262 (01000): Row 1 was truncated; it contained more data than there were input columns
Solution: Make sure your field and line terminators match the actual file format.
Real-World Application: Regular Data Updates
Here's a practical example of how you might use imports in a real-world scenario. Imagine you have an e-commerce application that receives product updates from suppliers via CSV files.
You could create a script that:
- Creates a temporary table matching your products table
- Imports the CSV into this temporary table
- Updates the main table with changes
-- Create temporary table
CREATE TEMPORARY TABLE temp_products LIKE products;
-- Import data
LOAD DATA INFILE '/path/to/products_update.csv'
INTO TABLE temp_products
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
-- Update main table (only changed records)
UPDATE products p
JOIN temp_products tp ON p.product_id = tp.product_id
SET p.price = tp.price,
p.stock_quantity = tp.stock_quantity,
p.last_updated = NOW()
WHERE p.price != tp.price OR p.stock_quantity != tp.stock_quantity;
-- Insert new products
INSERT INTO products
SELECT tp.*
FROM temp_products tp
LEFT JOIN products p ON tp.product_id = p.product_id
WHERE p.product_id IS NULL;
-- Drop temporary table
DROP TEMPORARY TABLE temp_products;
Summary
In this guide, we've covered the fundamentals of importing data into MySQL databases:
- Importing from SQL files using the command-line client and the source command
- Importing CSV data using LOAD DATA INFILE
- Using MySQL Workbench for imports
- Handling common issues like permissions and encoding
- Managing large datasets and performance considerations
- Using the mysqlimport utility
- Troubleshooting common import problems
- A real-world application for regular data updates
Understanding these import methods helps you effectively manage data migration and integration scenarios in your MySQL database applications.
Additional Resources
Here are some resources to further develop your MySQL import skills:
Exercises
- Create a CSV file with at least 10 records and import it into a new MySQL table.
- Export a table from an existing database as SQL, then import it into a new database.
- Create a script that imports a CSV file but transforms some data during import using MySQL variables and functions.
- Research and implement a solution for scheduling regular imports using cron jobs (Linux/Mac) or Task Scheduler (Windows).
- Try importing an XML or JSON file into MySQL (hint: you'll need to transform it first or use a different approach from what we've covered).