I want to import an Excel file with 32,594 rows into MySQL. I have converted the file to .csv and used LOAD DATA INFILE (Source: http://blog.habrador.com/2013/01/how-to-import-large-csv-file-into-mysql.html).
However, MySQL counts only 31,925 rows. I checked whether the table was cut off during the import, but when I ordered the table by Number (the AUTO_INCREMENT primary key) descending, the first row of the result had 32594 in the Number column.
I really do not understand why. I hope someone can help me with this. Thanks
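(For anyone debugging a similar mismatch, a hedged sketch of two checks that usually locate the missing rows; `mytable` stands in for the actual table name, which the question doesn't give.)

```sql
-- Run immediately after LOAD DATA INFILE: lists rows that were
-- skipped or had values truncated, with the reason for each
SHOW WARNINGS;

-- Find gaps in the AUTO_INCREMENT column (Number, per the question)
SELECT t1.Number + 1 AS gap_starts_at
FROM mytable t1
LEFT JOIN mytable t2 ON t2.Number = t1.Number + 1
WHERE t2.Number IS NULL
  AND t1.Number < (SELECT MAX(Number) FROM mytable);
```

A common cause of exactly this symptom is unescaped line breaks or quote characters inside CSV fields, which make several source rows collapse into one on import.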
Related
I was given an Excel (CSV) sheet containing database metadata.
I'm asking if there's a simple way to import the csv and create the tables from there?
Data is not part of this question. The CSV looks like this:
logical_table_name, physical_table_name, logical_column_name, physcial_column_name, data_type, data_length
There's about 2000 rows of metadata. I'm hoping I don't have to manually create the tables. Thanks.
I don't know of any direct import or creation tool. However, if I had to do this and couldn't find one, I would import the Excel file into a staging table (just a direct data import), and add a unique auto-increment ID column to the staging table to keep the rows in order.
Then I would use some queries to build the table and column creation commands from the raw data. Unless this were something I was setting up to run regularly, I would keep it dead simple rather than get fancy: build an individual ADD COLUMN command for each column, build a CREATE TABLE command from the first row of each table, and sort them all by the order ID, tables before columns. Then you should be able to copy the script column, check the commands, and go.
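A minimal sketch of that query-built-DDL idea, assuming a staging table named `staging` with an auto-increment `seq` column plus the CSV columns (names here follow the question's header; adjust them to whatever your import produced):

```sql
-- pri 0 = CREATE statements, pri 1 = ADD COLUMN statements,
-- so each table's CREATE sorts ahead of its column commands
SELECT CONCAT('CREATE TABLE ', physical_table_name, ' (dummy INT);') AS cmd,
       MIN(seq) AS ord, 0 AS pri
FROM staging
GROUP BY physical_table_name
UNION ALL
SELECT CONCAT('ALTER TABLE ', physical_table_name,
              ' ADD COLUMN ', physical_column_name, ' ',
              data_type, '(', data_length, ');') AS cmd,
       seq AS ord, 1 AS pri
FROM staging
ORDER BY ord, pri;
```

Copy the `cmd` column out, review it, run it, and finish with one `ALTER TABLE ... DROP COLUMN dummy;` per table. Types that take no length (e.g. DATE) would need a CASE expression around the parentheses.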
I have a website fed from large MySQL tables (>50k rows in some tables). Let's call one table "MotherTable". Every night I update the site with a new CSV file (produced locally) that has to replace the MotherTable data.
The way I do this currently (I am not an expert, as you see), is:
- First, I TRUNCATE the MotherTable table.
- Second, I import the CSV file into the empty table, with columns separated by "/" and skipping 1 line.
Since the CSV file is not very small, there are some seconds (or even a minute) during which MotherTable is empty, so web users running SELECTs on this table find nothing.
Obviously, I don't like that. Is there any procedure to update MotherTable so that users notice nothing? If not, what would be the quickest way to update the table with the new CSV file?
Thank you!
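One standard pattern for this (not from the question, but widely used) is to load the CSV into a shadow table and swap it in with RENAME TABLE, which is atomic, so readers always see either the old data or the new, never an empty table. A sketch, with the file path as a placeholder:

```sql
-- 1. Build the replacement off to the side; readers still see MotherTable
CREATE TABLE MotherTable_new LIKE MotherTable;

LOAD DATA INFILE '/path/to/tonight.csv'
INTO TABLE MotherTable_new
FIELDS TERMINATED BY '/'
IGNORE 1 LINES;

-- 2. Atomic swap: both renames happen as one operation
RENAME TABLE MotherTable TO MotherTable_old,
             MotherTable_new TO MotherTable;

DROP TABLE MotherTable_old;
```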
I am trying to import a CSV with columns [ID, timestamp, account, liters] -> [#, datetime, #, #] using MySQL Workbench 6.3. MySQL Workbench creates the table fine from the CSV (importing 1 record), and I then re-import the same CSV for the data rows. It identifies the columns and matches them to the table well. Then it takes a long time, reports that all rows have been imported, but the table shows no rows.
I have searched the forums, and it seems people have had problems with timestamps, but the solutions involved using the command line. Can someone please advise whether I should format my CSV differently? It's a big pain. Thanks
OK, so this was a problem with the date format. When I specified the datetime field, I couldn't see the field that pops up at the bottom to specify the format as per https://docs.python.org/2/library/datetime.html#strftime-strptime-behavior (screen resolution issues). Anyway, I typed the matching format in there and the CSV is importing now. Thanks
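(For anyone who ends up on the command line instead: the equivalent fix there is to parse the text into the datetime column with STR_TO_DATE during the load. A sketch with a placeholder path, table name, and format string; note that MySQL's format specifiers differ slightly from Python's strftime, e.g. %i for minutes.)

```sql
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE readings
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(ID, @ts, account, liters)
-- adjust the format to match the CSV, e.g. '03/27/2016 14:05:00'
SET `timestamp` = STR_TO_DATE(@ts, '%m/%d/%Y %H:%i:%s');
```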
I have to import CSV files for different clients into my system, some separated by [,], some by [|], etc. Always very big files.
While importing I need to filter out duplicate records; duplicates should not be inserted into the DB.
The problem is that the columns can differ between clients. I have a database for every client's CSV data, which I import every day, week, or month depending on the client's needs. I keep the data from every import so we can generate reports on what data we received in each CSV file; our system does further processing after the import.
Data structure example:
Client 1 database:
First_Name | Last_Name | Email | Phone | Ect…
95% of the data is the same in every new CSV file. Some new records arrive and some records are deleted from the CSV, so our system should only process the newly imported records.
Currently we import the data into a new table every time. We name each table with a timestamp so we can keep track of imports. It is an expensive process that duplicates records and tables.
I'm thinking about a solution and need your suggestions on it.
Keep one table per client: every time I import CSV file data into the table, I alter the existing table and add a new column named with the current date (byte or Boolean), setting it true/false on import??
My other question is about the first time I import a CSV file. I need to write a script such that:
While importing CSV data, if the table already exists, my date logic applies; if the table does not exist, the script should create it using the provided client name as the table name. The challenge is that I don't know the columns in advance; they should be created from the CSV file.
If the table already exists and new records come in, they should be inserted; existing records should be updated.
Is this doable in MySQL??
(I will also have to do something similar for MSSQL, but right now I need a solution for MySQL.)
Please help me... I'm not good at MySQL :(
You can certainly do an insert-or-update (INSERT ... ON DUPLICATE KEY UPDATE) statement when importing each record.
see here :
https://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html
I propose you create a script to dynamically create your table if it doesn't exist.
What language would you use to insert your CSV?
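A hedged sketch of the ON DUPLICATE KEY approach from that link, using the Client 1 columns from the question and assuming (my assumption, not the question's) that Email is what defines a duplicate, with an illustrative table name:

```sql
-- A unique key on the "duplicate-defining" column is required
ALTER TABLE client1_contacts ADD UNIQUE KEY uq_email (Email);

-- Per-record upsert: inserts new rows, updates rows whose Email already exists
INSERT INTO client1_contacts (First_Name, Last_Name, Email, Phone)
VALUES ('Jane', 'Doe', 'jane@example.com', '555-0100')
ON DUPLICATE KEY UPDATE
    First_Name = VALUES(First_Name),
    Last_Name  = VALUES(Last_Name),
    Phone      = VALUES(Phone);
```

For whole-file loads, LOAD DATA INFILE ... IGNORE (skip duplicates) or ... REPLACE (overwrite them) gives the same choice in bulk.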
Thanks for viewing this. I need a little help with a project I am working on with MySQL.
For part of the project I need to load a few things into a MySql database which I have up and running.
The info I need, for each column in the Documentation table, is stored in text files on my hard drive.
For example, one column in the Documentation table is "ports", so I have a ports.txt file on my computer with a bunch of port numbers, and so on.
I tried to run this MySQL statement through phpMyAdmin:
LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE `Documentation` (`ports`).
It ran successfully, so I went to do the other load I needed:
LOAD DATA INFILE 'C:\\vlan.txt' INTO TABLE `Documentation` (`vlans`)
This also completed successfully, but it added all of its rows to the vlans column AFTER the last entry in the ports column.
Why did this happen? Is there anything I can do to fix this? Thanks
Why did this happen?
LOAD DATA inserts new rows into the specified table; it doesn't update existing rows.
Is there anything I can do to fix this?
It's important to understand that MySQL doesn't guarantee that tables are kept in any particular order. So, after your first LOAD, the order in which the data were inserted may be lost and forgotten; therefore, one would typically relate such data prior to importing it (e.g. as columns of the same record within a single CSV file).
You could LOAD your data into temporary tables that each have an AUTO_INCREMENT column and hope that such auto-incremented identifiers remain aligned between the two tables (MySQL makes absolutely no guarantee of this, but in your case you should find that each record is numbered sequentially from 1); once there, you could perform a query along the following lines:
INSERT INTO Documentation SELECT port, vlan FROM t_Ports JOIN t_Vlan USING (id);
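For completeness, the staging step that query assumes might look like this (column types are illustrative):

```sql
CREATE TEMPORARY TABLE t_Ports (
    id   INT AUTO_INCREMENT PRIMARY KEY,
    port INT
);
CREATE TEMPORARY TABLE t_Vlan (
    id   INT AUTO_INCREMENT PRIMARY KEY,
    vlan INT
);

-- Each file loads into its own table; ids are assigned 1, 2, 3, ...
LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE t_Ports (port);
LOAD DATA INFILE 'C:\\vlan.txt'  INTO TABLE t_Vlan  (vlan);
```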