Import Only Matched Data From CSV to MySQL

I need to import some data from a very huge CSV file, about 1GB.
Instead of importing everything, I want to import only the matching rows; I think that will be easier and faster than importing all the data.
I need to search the "Post Code District" column of the CSV file and, if it contains LS1, LS2, or LS10, import the matching rows into a table in SQL.

Misconception. You think that filtering a text file against a database table is going to be faster than just loading the entire file into the database.
I suppose there are extreme cases where this might be true. But, in general, the safest way to handle these types of situations is:
Import the file into a staging table.
Add indexes to the staging table, as necessary, for performance.
Run a query to copy the data you want from the staging table.
I could phrase this a different way: in the time it would take you to figure out how to efficiently combine information from the file and a database table, you could probably go through the above process 10-50 times.
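As a minimal sketch of those three steps (the table, column, and file names here are hypothetical placeholders):

    -- Staging table mirroring the CSV layout; only the relevant
    -- column is shown, add the rest of your CSV's columns.
    CREATE TABLE staging_postcodes (
        post_code_district VARCHAR(10)
        -- ... remaining CSV columns ...
    );

    -- Bulk-load the whole 1GB file; LOAD DATA is much faster than
    -- inserting rows one at a time.
    LOAD DATA LOCAL INFILE '/path/to/postcodes.csv'
    INTO TABLE staging_postcodes
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row

    -- Index the filter column so the copy query stays fast.
    CREATE INDEX idx_district ON staging_postcodes (post_code_district);

    -- Copy only the rows you actually want into the real table.
    -- Exact district match assumed; adjust the predicate if
    -- "contains" means something looser.
    INSERT INTO target_table (post_code_district /* , ... */)
    SELECT post_code_district /* , ... */
    FROM staging_postcodes
    WHERE post_code_district IN ('LS1', 'LS2', 'LS10');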

Related

How to import a CSV file into MySQL tables, filtering only rows that meet a condition?

I have to import a CSV file into a MySQL table. The file and the table have the same structure (columns and value types). The import should follow this algorithm:
read a line from the file,
compare the value in column X (file) with the value in column X (db table),
import the line if the value in the file is higher than the value in the db table.
The question is, slightly off-topic: how can this kind of import be automated? Are there tools for it? Or is phpMyAdmin enough - create a temporary table, load the file into it, compare, and import?
For everybody who is looking for the same or similar solutions:
The tool that helps accomplish such tasks is called Webyog (no link - it's not advertising, search for it).
The tool lets you specify an SQL query that will govern the import.
An answer about this solution was provided by the tool's support - it should work as expected. For me, this will be the accepted solution.
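If you'd rather avoid a third-party tool, here is a rough sketch of the staging-table approach the question itself mentions. It assumes a hypothetical key column id for matching rows and that x is "column X":

    -- Temporary staging table with the same structure as the target.
    CREATE TEMPORARY TABLE staging LIKE target;

    LOAD DATA LOCAL INFILE '/path/to/data.csv'
    INTO TABLE staging
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';

    -- Overwrite a row only when the file's value in column x is higher.
    UPDATE target t
    JOIN staging s ON s.id = t.id
    SET t.x = s.x
    WHERE s.x > t.x;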

Importing a CSV file into MySQL (specifically about the CREATE TABLE command)

I have a text file full of values. The first line is a list of column names, like this:
col_name_1, col_name_2, col_name_3 ......(600 columns)
and all the following lines have values like this:
1101,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1101,1,3.86,65,0.46418,65,0.57151...
What is the best way to import this into MySQL?
Specifically, how do I come up with the proper CREATE TABLE command so that the data loads properly? What is the best generic data type that would take in all of the above values, like 1101 or 3.86 or 0.57151? I am not worried about the table being inefficient in terms of storage, as I need this for one-time usage.
I have tried some of the suggestions in other related questions, like using phpMyAdmin (it crashes, I am guessing due to the large amount of data).
Please help!
Data in CSV files is not normalized; those 600 columns could be spread across a couple of related tables, and that is the recommended way of treating such data. You can then use fgetcsv() to read the CSV file line by line in PHP.
To make MySQL process the CSV directly, you can create a 600-column table (I think) and issue a LOAD DATA LOCAL INFILE statement (or perhaps use mysqlimport; I'm not sure about that).
The most generic data type would have to be VARCHAR, or TEXT for bigger values, but of course you lose semantics when using it for numbers, dates, etc.
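For example, a sketch of both statements; the table and file names are made up, and only the first few of the 600 columns are written out:

    -- Generic table: VARCHAR is wide enough for values like
    -- 1101, 3.86, or 0.57151, and empty fields load as empty strings.
    CREATE TABLE wide_import (
        col_name_1 VARCHAR(64),
        col_name_2 VARCHAR(64),
        col_name_3 VARCHAR(64)
        -- ... repeat up to column 600; generating this DDL with a
        -- small script beats typing it by hand ...
    );

    LOAD DATA LOCAL INFILE '/path/to/data.txt'
    INTO TABLE wide_import
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- the first line holds the column names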
I noticed that you included the phpmyadmin tag.
phpMyAdmin can handle this out of the box. It will "magically" decide which type to give each column, and it will CREATE the table for you as well as INSERT all the data. There is no need to worry about LOAD DATA INFILE, though that method can be safer if you want to know exactly what's going on without relying on phpMyAdmin's magic tooling.
Try convertcsvtomysql: just upload your CSV file, and then you can download and/or copy the MySQL statements to create the table and insert the rows.

Import multiple (100+) Excel files into MySQL Table

I have 100+ XLSX files that I need to get into a MySQL database. Each file is a bit different, so I've created a massive table that contains a field for every possible column header across all the files. This way they can auto-map on import.
Using Navicat I can import the files one at a time, but I'm wondering if there is a way to import all of the files at one time?
I'm thinking 'no', but if you're going to do this a lot, I think there are ways to automate it.
Export your XLSX files to CSV files instead, and write a script to transform them from Excel-style CSV to MySQL-style CSV.
Create tables for importing with the CSV engine. Put your files in place of the ones created by MySQL and flush tables. Now your data is ready to be read inside MySQL and copied over to more powerful table engines.
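A rough sketch of that CSV-engine trick (table and column names are illustrative):

    -- A CSV-engine table is backed by a plain .CSV file in the
    -- database directory; its columns must be NOT NULL.
    CREATE TABLE csv_in (
        a VARCHAR(64) NOT NULL,
        b VARCHAR(64) NOT NULL
    ) ENGINE=CSV;

    -- Replace csv_in.CSV in the data directory with your exported
    -- file (same column layout), then make MySQL re-read it:
    FLUSH TABLES;

    -- Copy into a more powerful engine for indexing and querying.
    CREATE TABLE real_table LIKE csv_in;
    ALTER TABLE real_table ENGINE=InnoDB;
    INSERT INTO real_table SELECT * FROM csv_in;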
Another way is to write a VBA script that exports the data in a format recognized by LOAD DATA INFILE and then load the files using MySQL.

SSIS: adding a column to a destination table after import

I have an SSIS package that I use to import Excel data. I need to add a column to the table when I import the data; however, the column is contrived data joined from another table that already exists in the SQL Server database.
Does anybody know how I would even begin to do this?
I've tried "derived column" however, the data that populates the column is not derived from the source excel data, rather, a join from the data to that other table.
Thanks
You can use a lookup in addition to the methods from #HLGEM.
I know of two methods. One: you can use a merge join in the data flow. This tends to be slow because you have to sort both sources for the merge. If your data set is not large, though, this might not be too bad.
If your data source is large, I prefer to import the data into a work table in one data flow first. Then the data source in the second data flow (the one that imports into the production table) would be a query that joins the work table to the existing table you want to grab the other information from. This is more time-consuming to set up, but here we never import anything without a work table, because it makes going back to research data-import issues so much easier. It also makes it easier to clean up the data before import, in my opinion, as I am not a fan of doing the clean-up in the data flow.
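As a sketch, the source query for that second data flow might look like this (work_table, existing_table, and the column names are hypothetical):

    -- Join the work table to the existing table so the extra,
    -- "contrived" column arrives alongside the imported rows.
    SELECT w.*,
           e.extra_value AS contrived_column
    FROM work_table w
    JOIN existing_table e
        ON e.key_col = w.key_col;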

CSV to MySQL conversion and import

I'm working on a large project and haven't had to do what I need help with before. I have a CSV file containing a large amount of data, namely all of the cities, towns, and suburbs in Australia. I need to convert the CSV file to SQL for MySQL and then import it into the database.
What would be the best way to achieve this?
Use LOAD DATA INFILE or the equivalent command-line tool mysqlimport.
These are easy to use for loading CSV data, and this approach typically runs about 20x faster than inserting rows one at a time with SQL.
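For instance, a minimal sketch; the table layout and file path are assumptions about your data:

    CREATE TABLE au_localities (
        name     VARCHAR(100),
        state    VARCHAR(3),
        postcode CHAR(4)
    );

    -- No conversion to SQL needed: LOAD DATA reads the CSV directly.
    LOAD DATA LOCAL INFILE '/path/to/au_localities.csv'
    INTO TABLE au_localities
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row, if there is one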