Load large CSV files into a MySQL database
I have written code that loops through a CSV file and inserts each row into a table, but this is not the best approach when the file is large, as the system has to run thousands of INSERT statements. Does anyone have any ideas on how this can be done? I have tried the LOAD DATA INFILE statement, but it keeps throwing errors.
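The errors are often down to syntax: the statement is LOAD DATA INFILE, with the target table named in an INTO TABLE clause. A minimal sketch, assuming a comma-separated file with one header row (the path, table name, and terminators here are placeholders to adjust for your data):

```sql
LOAD DATA LOCAL INFILE '/path/to/TableName.csv'
INTO TABLE TableName
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
```

Note that LOCAL reads the file from the client machine; without it, the server reads the path itself and the connecting user needs the FILE privilege.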
Use the mysqlimport command. It is fast and well suited to large CSV files.
mysqlimport --ignore-lines=1 \
  --fields-terminated-by=, \
  --local -u root \
  -p Database \
  TableName.csv
Thanks Miguel, I got everything working fine but hit another problem. You're OK if you just run normal SELECT statements, but when you try to run any statement with a WHERE clause, MySQL returns the same record twice. My project reached a dead end, as the hosting provider doesn't allow the mysqlimport command for security reasons; Apache needs settings changed for it to work. Thanks anyway.
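For anyone hitting the same hosting restriction: when mysqlimport and LOAD DATA are unavailable, batching many rows into each INSERT statement still cuts the statement count dramatically compared with one INSERT per row. A rough sketch in Python (the table and column names are placeholders; real code should use your database driver's parameterized queries rather than this naive quoting):

```python
import csv
import io

def batched_inserts(csv_text, table, batch_size=1000):
    """Read CSV text (first row = column names) and yield multi-row
    INSERT statements, up to batch_size rows per statement."""
    reader = csv.reader(io.StringIO(csv_text))
    columns = next(reader)              # header row becomes the column list
    col_list = ", ".join(columns)
    batch = []
    for row in reader:
        # Naive quoting for illustration only: double up single quotes.
        values = ", ".join("'%s'" % v.replace("'", "''") for v in row)
        batch.append("(%s)" % values)
        if len(batch) == batch_size:
            yield "INSERT INTO %s (%s) VALUES %s;" % (
                table, col_list, ", ".join(batch))
            batch = []
    if batch:                           # flush the final partial batch
        yield "INSERT INTO %s (%s) VALUES %s;" % (
            table, col_list, ", ".join(batch))
```

With batch_size=1000, a 100,000-row file needs 100 statements instead of 100,000, which is usually fast enough on shared hosting where the import tools are locked down.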