SQL Import Export

Introduction

Data migration is a critical skill for any database professional. Whether you're backing up information, transferring data between systems, or loading test data into a database, knowing how to efficiently import and export SQL data is essential. This guide will walk you through the common methods, tools, and best practices for importing and exporting data in SQL databases.

SQL import and export operations allow you to:

  • Move data between different database systems
  • Create backups of your database
  • Share data with other applications
  • Migrate to new database versions
  • Load test data during development

Basic Concepts

Before diving into specific techniques, let's understand some fundamental concepts:

What is SQL Import?

Importing in SQL refers to the process of loading data from an external source (like CSV files, Excel spreadsheets, or another database) into your SQL database tables.

What is SQL Export?

Exporting in SQL refers to the process of extracting data from your SQL database and saving it in an external format that can be used by other applications or databases.

Common File Formats

The most common file formats used for SQL import and export operations include:

  • CSV (Comma-Separated Values): Simple text files where values are separated by commas
  • TSV (Tab-Separated Values): Similar to CSV, but uses tabs instead of commas
  • JSON (JavaScript Object Notation): A lightweight data-interchange format
  • XML (eXtensible Markup Language): A markup language that defines rules for encoding documents
  • SQL Scripts: Contains SQL statements to recreate database objects and data
  • Excel files: Microsoft Excel spreadsheets (.xlsx or .xls)

Let's explore how to perform import and export operations with these formats across different SQL database systems.

Exporting Data from SQL Databases

MySQL/MariaDB Export

Using the mysqldump Command Line Tool

mysqldump is a powerful utility that ships with MySQL/MariaDB and lets you create backups or export data:

bash
# Export entire database
mysqldump -u username -p database_name > backup.sql

# Export specific tables
mysqldump -u username -p database_name table1 table2 > tables_backup.sql

# Export structure only (no data)
mysqldump -u username -p --no-data database_name > structure.sql

# Export data only (no structure)
mysqldump -u username -p --no-create-info database_name > data.sql

Export to CSV using SQL Commands

sql
-- Export data to CSV (from MySQL command line)
SELECT * FROM customers
INTO OUTFILE '/var/lib/mysql-files/customers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Note: The directory path must be writable by the MySQL server process.

Using MySQL Workbench

  1. Right-click on your table in the schema navigator
  2. Select "Table Data Export Wizard"
  3. Follow the prompts to select columns and export format

PostgreSQL Export

Using pg_dump Command Line Tool

bash
# Export entire database
pg_dump -U username -F p database_name > backup.sql

# Export specific table
pg_dump -U username -t table_name database_name > table_backup.sql

# Export as compressed file
pg_dump -U username database_name | gzip > backup.sql.gz
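
pg_dump also supports a compressed "custom" archive format that can be restored selectively with pg_restore. A minimal sketch, using the same placeholder user and database names:

bash
# Export in PostgreSQL's custom format (compressed, restorable with pg_restore)
pg_dump -U username -F c database_name > backup.dump

# Restore the custom-format dump into an existing database
pg_restore -U username -d database_name backup.dump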

Export to CSV

sql
-- Export data to CSV
COPY customers TO '/tmp/customers.csv' WITH CSV HEADER;

-- Export with specific delimiter
COPY customers TO '/tmp/customers.csv' WITH CSV DELIMITER '|' HEADER;
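
The server-side COPY command requires a path that the PostgreSQL server process can read or write. If you only have client access, psql's \copy meta-command works with files on your local machine instead and accepts the same options; a short sketch from the psql prompt:

sql
-- Client-side export and import (file lives on your machine, not the server)
\copy customers TO 'customers.csv' WITH CSV HEADER
\copy customers FROM 'customers.csv' WITH CSV HEADER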

SQL Server Export

Using SQL Server Management Studio (SSMS)

  1. Right-click on your database in Object Explorer
  2. Select "Tasks" > "Export Data..."
  3. Follow the Export Wizard to select source, destination, and mapping

Using BCP Utility

bash
# Export table to a file
bcp database_name.schema_name.table_name out C:\data\output.csv -c -t, -S server_name -U username -P password

Using T-SQL Commands

sql
-- Export to CSV
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

EXEC xp_cmdshell 'bcp "SELECT * FROM database.dbo.customers" queryout "C:\data\customers.csv" -c -t, -S server_name -T';
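
Because xp_cmdshell grants operating-system access, it is good practice to turn it back off once the export has finished:

sql
-- Disable xp_cmdshell again after the export
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;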

SQLite Export

bash
# Export entire database to SQL statements
sqlite3 database.db .dump > backup.sql

# Export specific table
sqlite3 database.db "SELECT * FROM table_name;" > table_data.txt

# Export to CSV
sqlite3 database.db ".headers on" ".mode csv" ".output data.csv" "SELECT * FROM table_name;"

Importing Data into SQL Databases

MySQL/MariaDB Import

Importing SQL Files

bash
# Import SQL file
mysql -u username -p database_name < backup.sql

Importing CSV Files

sql
-- Import CSV file
LOAD DATA INFILE '/path/to/customers.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; -- Skip header row if present

Using MySQL Workbench

  1. Right-click on your schema in the navigator
  2. Select "Table Data Import Wizard"
  3. Follow the prompts to select file and import options

PostgreSQL Import

Importing SQL Files

bash
# Import SQL file
psql -U username database_name < backup.sql

Importing CSV Files

sql
-- Import CSV file
COPY customers FROM '/tmp/customers.csv' WITH CSV HEADER;

-- Import with specific delimiter
COPY customers FROM '/tmp/customers.csv' WITH CSV DELIMITER '|' HEADER;

-- Import with column specification
COPY customers(first_name, last_name, email) FROM '/tmp/customers.csv' WITH CSV HEADER;

SQL Server Import

Using SQL Server Management Studio (SSMS)

  1. Right-click on your database in Object Explorer
  2. Select "Tasks" > "Import Data..."
  3. Follow the Import Wizard to select source, destination, and mapping

Using BCP Utility

bash
# Import from file to table
bcp database_name.schema_name.table_name in C:\data\input.csv -c -t, -S server_name -U username -P password

Using BULK INSERT

sql
-- Import CSV file
BULK INSERT customers
FROM 'C:\data\customers.csv'
WITH (
    FORMAT = 'CSV',
    FIRSTROW = 2,            -- Skip header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
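
BULK INSERT can also tolerate and log rejected rows instead of aborting the whole load; a sketch using the MAXERRORS and ERRORFILE options (file paths are placeholders, and exact option support varies by SQL Server version):

sql
-- Allow up to 10 bad rows and write them to an error file for review
BULK INSERT customers
FROM 'C:\data\customers.csv'
WITH (
    FORMAT = 'CSV',
    FIRSTROW = 2,
    MAXERRORS = 10,
    ERRORFILE = 'C:\data\customers_errors.log'
);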

SQLite Import

bash
# Import SQL file
sqlite3 database.db < backup.sql

# Import CSV file
sqlite3 database.db ".mode csv" ".import /path/to/customers.csv customers"

Practical Examples

Example 1: Database Migration Between Different SQL Systems

Let's say you want to migrate a database from MySQL to PostgreSQL:

Step 1: Export data from MySQL

bash
mysqldump -u root -p --compatible=postgresql my_database > mysql_dump.sql

Step 2: Modify the SQL file for PostgreSQL compatibility

You might need to manually edit the SQL file or use a tool like pgloader to handle compatibility issues. Note that in MySQL 8.0 and later, mysqldump only accepts --compatible=ansi, so pgloader is often the more practical route.
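
pgloader can usually perform the whole migration in a single step by connecting to both databases directly; a minimal sketch with placeholder connection strings:

bash
# Migrate schema and data straight from MySQL into PostgreSQL
pgloader mysql://root:password@localhost/my_database \
         postgresql://postgres:password@localhost/new_database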

Step 3: Import into PostgreSQL

bash
psql -U postgres new_database < mysql_dump.sql

Example 2: Creating and Loading a Test Database

When developing applications, you often need to populate your database with test data:

Step 1: Create export of production schema (structure only)

bash
mysqldump -u username -p --no-data production_db > schema.sql

Step 2: Create sample data in CSV format

csv
id,name,email,signup_date
1,John Doe,[email protected],2023-01-15
2,Jane Smith,[email protected],2023-02-20
3,Bob Johnson,[email protected],2023-03-05

Step 3: Import schema to test database

bash
mysql -u username -p test_db < schema.sql

Step 4: Import sample data

sql
LOAD DATA INFILE '/path/to/sample_users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

Example 3: Regular Database Backups

Automating regular database backups is a critical practice:

Step 1: Create a backup script (backup.sh)

bash
#!/bin/bash
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
BACKUP_DIR="/var/backups/database"
DB_USER="username"
DB_PASSWORD="password"
DB_NAME="database_name"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Create the backup
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" | gzip > "$BACKUP_DIR/backup_$TIMESTAMP.sql.gz"

# Delete backups older than 30 days
find "$BACKUP_DIR" -name "backup_*.sql.gz" -type f -mtime +30 -delete

Step 2: Make the script executable

bash
chmod +x backup.sh

Step 3: Schedule with cron

bash
# Add to crontab to run daily at 2 AM
0 2 * * * /path/to/backup.sh

Best Practices for SQL Import and Export

1. Always Back Up Before Importing

Before importing data, especially into a production database, always create a backup. This provides a safety net in case anything goes wrong.
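
For example, a quick safety snapshot before a risky import (MySQL shown; use pg_dump or your system's equivalent elsewhere):

bash
# Snapshot the target database before importing anything
mysqldump -u username -p database_name > pre_import_backup.sql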

2. Validate Data Before Importing

Check your data for inconsistencies, missing values, or incorrect formats before importing. Consider loading into a staging table first for validation.
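
A minimal staging-table sketch in MySQL, with hypothetical table and column names: load into a copy of the target table, run sanity checks, and only then move the rows across.

sql
-- Hypothetical staging workflow; the checks depend on your schema
CREATE TABLE customers_staging LIKE customers;

LOAD DATA INFILE '/path/to/customers.csv'
INTO TABLE customers_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

-- Sanity checks: row count, duplicate emails, etc.
SELECT COUNT(*) FROM customers_staging;
SELECT email, COUNT(*) FROM customers_staging GROUP BY email HAVING COUNT(*) > 1;

-- Move validated rows into the real table
INSERT INTO customers SELECT * FROM customers_staging;
DROP TABLE customers_staging;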

3. Use Transactions for Large Imports

Wrap your import operations in transactions so you can roll back if errors occur:

sql
BEGIN TRANSACTION;

-- Import operations here

-- If everything is successful
COMMIT;

-- If there are errors
-- ROLLBACK;

4. Consider Performance for Large Datasets

For very large datasets:

  • Disable indexes before import and rebuild them afterward
  • Disable foreign key constraints during import
  • Break the import into smaller batches
  • Consider using bulk loading tools specific to your database
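
A hedged MySQL sketch of the first two points; the exact statements vary by storage engine and database system:

sql
-- Speed up a large MySQL import by deferring index and constraint work
SET foreign_key_checks = 0;
SET unique_checks = 0;
ALTER TABLE customers DISABLE KEYS;   -- applies to MyISAM non-unique indexes

-- ... run the import here (LOAD DATA INFILE, bulk INSERTs, etc.) ...

ALTER TABLE customers ENABLE KEYS;
SET unique_checks = 1;
SET foreign_key_checks = 1;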

5. Handle Character Encoding Correctly

Specify the correct character encoding to avoid issues with special characters:

bash
# MySQL example with UTF-8 encoding
mysqldump -u username -p --default-character-set=utf8mb4 database_name > backup.sql

6. Document Your Import/Export Procedures

Keep detailed documentation of your import/export procedures, including:

  • Source and destination database details
  • Field mappings
  • Transformation rules
  • Validation criteria

Troubleshooting Common Issues

Permission Errors

If you encounter permission errors during export or import:

  • Check file system permissions
  • Verify database user permissions
  • Look for secure-file-priv restrictions in MySQL

sql
-- Check secure-file-priv setting in MySQL
SHOW VARIABLES LIKE 'secure_file_priv';

Data Type Incompatibilities

Different databases handle data types differently. Watch for:

  • Date/time format differences
  • String length limitations
  • Numeric precision differences
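
One common workaround is to load ambiguous values (dates in particular) into text columns in a staging table and convert them explicitly. A PostgreSQL-flavoured sketch with hypothetical table and column names:

sql
-- Convert a text date from the source system into a proper DATE on the way in
INSERT INTO orders (id, order_date)
SELECT id, TO_DATE(order_date_text, 'DD/MM/YYYY')
FROM orders_staging;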

Character Encoding Issues

If you see strange characters after import:

  • Verify source and destination encoding settings
  • Use explicit encoding specifications in your import/export commands
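
For example, both MySQL's LOAD DATA and PostgreSQL's COPY let you name the file's encoding explicitly:

sql
-- MySQL: declare the file's character set during import
LOAD DATA INFILE '/path/to/customers.csv'
INTO TABLE customers
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

-- PostgreSQL: declare the file's encoding during import
COPY customers FROM '/tmp/customers.csv' WITH (FORMAT csv, HEADER, ENCODING 'UTF8');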

Advanced Topics

Handling Large Datasets with Streaming

For extremely large datasets, consider streaming approaches:

python
# Python example using pandas for chunked processing
import pandas as pd
from sqlalchemy import create_engine

# Database connection (adjust the URL for your environment)
connection = create_engine('postgresql://user:pass@localhost/dest_db')

# Read the CSV in chunks of 10,000 rows
for chunk in pd.read_csv('large_file.csv', chunksize=10000):
    # Process each chunk (cleaning, transformations, etc.)
    # ...
    # Append the chunk to the destination table
    chunk.to_sql('table_name', connection, if_exists='append', index=False)

Automating Import/Export with Scripts

Creating reusable scripts can save time and reduce errors:

python
# Python example with SQLAlchemy
from sqlalchemy import create_engine
import pandas as pd

# Source and destination connections
source_engine = create_engine('mysql://user:pass@localhost/source_db')
dest_engine = create_engine('postgresql://user:pass@localhost/dest_db')

# Export query
df = pd.read_sql('SELECT * FROM source_table', source_engine)

# Transform data if needed
# df['column'] = df['column'].apply(some_transformation)

# Import to destination
df.to_sql('destination_table', dest_engine, if_exists='replace', index=False)

Summary

SQL import and export operations are essential skills for database management. We've covered:

  • Basic concepts of importing and exporting data
  • Common methods for different database systems (MySQL, PostgreSQL, SQL Server, SQLite)
  • Practical examples for real-world scenarios
  • Best practices to ensure successful data transfers
  • Troubleshooting common issues
  • Advanced techniques for handling complex scenarios

By mastering these techniques, you'll be able to efficiently move data between systems, create reliable backups, and manage your databases more effectively.

Additional Resources

Exercises

  1. Basic Export/Import: Create a small table in your preferred database, export it to CSV, delete the table, and then recreate it by importing the CSV.

  2. Cross-Platform Migration: Try exporting data from one database system (e.g., MySQL) and importing it into another (e.g., PostgreSQL). Document any challenges you encounter.

  3. Automated Backup: Write a script that automatically exports a database table to a timestamped file on a schedule.

  4. Data Transformation: Export data from a database, transform it (e.g., change date formats, combine fields), and import it back.

  5. Performance Optimization: Compare the performance of different import methods (e.g., LOAD DATA INFILE vs. INSERT statements) with a large dataset.
