
Quickly Importing Local Data into MySQL

MySQL provides a command for quickly loading data from a local file. Its full syntax is:

LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number LINES]
    [(col_name_or_user_var,...)]
    [SET col_name = expr,...]
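As a quick illustration of the optional clauses, here is a hypothetical import of a CSV file that skips a header row (the file, table, and column names are made up for this sketch and are not part of the experiment below):

LOAD DATA LOCAL INFILE '/tmp/users.csv'
    INTO TABLE users
    CHARACTER SET utf8
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (id, name, created_at);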

Experiment: importing a 233 MB data file

The table structure is as follows:

mysql> SHOW COLUMNS FROM load_file_test;
+-------+------------+------+-----+---------+-------+
| Field | Type       | Null | Key | Default | Extra |
+-------+------------+------+-----+---------+-------+
| mid   | int(10)    | YES  |     | NULL    |       |
| time  | int(10)    | YES  |     | NULL    |       |
| type  | tinyint(4) | YES  |     | NULL    |       |
+-------+------------+------+-----+---------+-------+
3 rows in set (0.12 sec)
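For reference, a minimal DDL sketch that matches the SHOW COLUMNS output above (the original post does not show the CREATE TABLE, so engine and charset are assumed to be server defaults):

CREATE TABLE load_file_test (
    mid    int(10),
    `time` int(10),
    type   tinyint(4)
);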

The data file format:

1450025|2|1343145600
1586865|1|1343145600
2557075|2|1343145600
2663240|2|1343145600
3787375|2|1343145600
4293640|1|1343145600

Execution result (the trailing column list (mid,type,time) is needed because the file's field order differs from the table's column order of mid, time, type):

mysql> LOAD DATA LOCAL INFILE '/usr/local/wwwroot/texas/fansPrizeData/2012-12' INTO TABLE load_file_test FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n' (mid,type,time);
Query OK, 11116864 rows affected (2 min 25.11 sec)
Records: 11116864 Deleted: 0 Skipped: 0 Warnings: 0
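A couple of simple sanity checks after the load (the expected count is taken from the output above; these were not part of the original experiment):

SELECT COUNT(*) FROM load_file_test;   -- should return 11116864
SELECT * FROM load_file_test LIMIT 3;  -- spot-check a few rows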


Over two hundred megabytes of data took about two and a half minutes to load, which is reasonably fast.
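One caveat worth noting: LOAD DATA LOCAL only works when local_infile is enabled on the server (and allowed by the client library). A sketch for checking and enabling it, assuming sufficient privileges:

SHOW VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;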


Reference: mysql dev

posted @ 2012-08-24 15:38  huan&ping