Excel CSV to MySQL - numbers > 1000 become 1.00 - mysql

While uploading an Excel CSV to MySQL, I find that '1000' becomes 1.00 and the maximum value stored is 999.99. Uploading the same table from a Microsoft Works spreadsheet works fine, but it has a limit on the number of rows. How can I upload a large table to MySQL? I have the data in .txt files that have to be converted to .csv.

It might be due to the thousands separator. Make sure that in your CSV you see "1000" and not "1,000".
Also see mySQL load data local infile incorrect number format; it might help you out.
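To illustrate the thousands-separator theory (a sketch; the column name and type below are assumptions, since the real table definition isn't shown): MySQL stops converting a string to a number at the first character it can't parse, so "1,000" loads as 1, while values under 1000 contain no comma and load intact:

-- Hypothetical column; the two decimal places in the question suggest DECIMAL(x,2)
CREATE TABLE demo (amount DECIMAL(10,2));

INSERT INTO demo VALUES ('999.99');  -- stored as 999.99, as expected
INSERT INTO demo VALUES ('1,000');   -- parsing stops at the comma: stored as 1.00
                                     -- (a warning in non-strict mode; strict mode rejects it)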

The problem is either the formatting in your Excel file or the limits of the numeric column in your MySQL table. Can you upload the file to Google Docs and show us the query used to create the table (alternatively, a screenshot of the table's structure)?

Related

Google BigQuery wildcard datasets

I have 45 CSV files stored in Google Cloud Storage in the same folder. When wildcarding these into a dataset table, I am finding that some of the rows are missing once I connect the data to Tableau. If I just select one of the files, all the data appears. All the files are called "PMPRO_PIVOT_ASDKE", where the last 5 characters change for each file. I have tried wildcarding with "PMPRO_PIVOT*" and it takes data from each file, but some of the data is missing from each one.
Any ideas would be great, as I've been trying to solve this all day.
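For reference, a minimal sketch of how such a wildcard source is typically declared in BigQuery DDL (the bucket, dataset, and table names are assumptions, and the schema is left to auto-detection). Note that skip_leading_rows is applied to every matched file, so a file without a header row would silently lose its first data row:

CREATE EXTERNAL TABLE mydataset.pmpro_pivot
OPTIONS (
  format = 'CSV',
  uris = ['gs://my-bucket/PMPRO_PIVOT*'],
  skip_leading_rows = 1  -- applied per matched file
);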

Searching for an option to compare data of an SQL table with a txt file in CSV form

I have a problem: I'm totally new to SQL and have to learn it during an internship. I had to import huge txt files into a database in phpMyAdmin (it took me forever, but I managed it with LOAD DATA INFILE). Now my task is to find a way to check whether the data in the tables is the same as the data in the given txt files.
Is there any way to do so?
Have you tried exporting the data through phpMyAdmin using a different file format instead of .sql? phpMyAdmin gives you several choices, including CSV and OpenOffice spreadsheets. That would make your comparison easier. You could use Excel, sort the data, and you'd have a quicker compare.
The best way to do so is to load, and then extract.
Compare your extract with the original file.
Another way could be to count the number of lines in both the table and the file, then extract a few lines and verify that they exist in both. This is less precise.
But this has nothing to do with SQL; it is just test logic.
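A minimal sketch of the counting approach (table and column names are placeholders; the file-side line count would come from your shell or editor):

-- Row count to compare against the txt file's line count
SELECT COUNT(*) FROM imported_table;
-- Spot-check: copy a few values from the txt file and verify they exist
SELECT * FROM imported_table
WHERE key_column = 'value_copied_from_the_txt_file';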

When SQL Server 2008 query results are exported to a CSV file, extra rows are added

When I export my query results from SQL Server 2008 to CSV or tab-delimited txt format, I always end up seeing extra records (that are not blank) when I open the exported file in Excel or import it into Access.
The SQL query results return 116623 rows,
but when I export to CSV and open it with Excel I see 116640 records. I tried importing the CSV file into Access and I also see extra records.
The weird thing is that if I add up the totals in Excel up to row 116623, I have the correct totals, meaning I have the right data up to that point, but the extra 17 records after that are bad data, and I don't know how they are being added.
Does anyone know what might be causing these extra records/rows to appear at the end of my CSV file?
The way I am exporting is by right-clicking on the results and exporting to CSV (comma delimited) or txt (tab delimited) files; both cause the problem.
I would bet that in that huge number of rows you have some data with a carriage return internal to the record (such as an address field that includes a line break). Look for rows that have empty data in some of the columns where you would expect data. I usually re-import the file into a work table (with an identity column so you can identify which rows are near the bad ones) and then run queries on it to find the ones that are bad.
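A hedged sketch of such a query (table and column names are placeholders): embedded line breaks are CHAR(13) and CHAR(10) in T-SQL, so you can search for them directly:

-- Find rows whose text column contains an embedded CR or LF
SELECT id, address
FROM work_table
WHERE address LIKE '%' + CHAR(13) + '%'
   OR address LIKE '%' + CHAR(10) + '%';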
Actually, there is a bug in the export-results feature. After exporting the results, open the CSV file in a hex editor and look up the unique key of the last record; you will find it towards the end of the file. Find the 0D 0A (CR/LF) for that record and delete everything that follows. It's not Excel or Access. For some reason SQL Server just can't export a CSV without corrupting the end of the file.

Import Excel data to SQL Server

What is the correct way of importing data from an Excel 2007 file to a SQL Server 2008 database? The data from the Excel file should be transferred successfully even if the data in the Excel file is changed or replaced with different data, except for the column names in the first row.
Excel is very finicky about how you remove rows from it. Your best bet is to select all rows below the column headers, right-click, and delete. If you simply clear contents and then don't paste in as many rows, it looks to SSIS like you have rows with nulls in them, which sometimes causes the types of errors you are seeing.
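The answer above assumes an SSIS import; as a hedged alternative sketch, SQL Server can also read a workbook directly via OPENROWSET, provided the ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled (the path and sheet name are assumptions):

-- Reads Sheet1 of an .xlsx into a new table; HDR=YES treats the
-- first row as column names (path and sheet are placeholders)
SELECT *
INTO dbo.ImportedData
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\data\workbook.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');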

How to import an Excel file into a MySQL database

Can anyone explain how to import a Microsoft Excel file into a MySQL database?
For example, my Excel table looks like this:
Country | Amount | Qty
----------------------------------
America | 93 | 0.60
Greece | 9377 | 0.80
Australia | 9375 | 0.80
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
(Disclaimer: I help run SQLizer)
Below is another method to import spreadsheet data into a MySQL database that doesn't rely on any extra software. Let's assume you want to import your Excel table into the sales table of a MySQL database named mydatabase.
Step 1: Select the relevant cells.
Step 2: Paste into Mr. Data Converter and select the output as MySQL.
Step 3: Change the table name and column definitions to fit your requirements in the generated output:
CREATE TABLE sales (
id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
Country VARCHAR(255),
Amount INT,
Qty FLOAT
);
INSERT INTO sales
(Country,Amount,Qty)
VALUES
('America',93,0.60),
('Greece',9377,0.80),
('Australia',9375,0.80);
If you're using MySQL Workbench or already logged into mysql from the command line, then you can execute the generated SQL statements from step 3 directly. Otherwise, paste the code into a text file (e.g., import.sql) and execute this command from a Unix shell:
mysql mydatabase < import.sql
Other ways to import from a SQL file can be found in this Stack Overflow answer.
Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
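Putting those clauses into a complete statement, a minimal sketch (the file path, table name, and column list are assumptions):

-- Assumes a tab-delimited export with a header row; IGNORE 1 LINES skips it
LOAD DATA LOCAL INFILE '/tmp/sales.txt'
INTO TABLE sales
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(Country, Amount, Qty);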
There are actually several ways to import an Excel file into a MySQL database, with varying degrees of complexity and success.
Excel2MySQL. Hands down the easiest and fastest way to import Excel data into MySQL. It supports all versions of Excel and doesn't require an Office install.
LOAD DATA INFILE: This popular option is perhaps the most technical and requires some understanding of MySQL command execution. You must manually create your table before loading and use appropriately sized VARCHAR field types, so your field data types are not optimized. LOAD DATA INFILE has trouble importing large files that exceed the 'max_allowed_packet' size (see the check sketched after this list). Special attention is required to avoid problems importing special characters and foreign unicode characters. Here is a recent example I used to import a csv file named test.csv.
phpMyAdmin: Select your database first, then select the Import tab. phpMyAdmin will automatically create your table and size your VARCHAR fields, but it won't optimize the field types. phpMyAdmin also has trouble importing large files that exceed the 'max_allowed_packet' size.
MySQL for Excel: This is a free Excel add-in from Oracle. This option is a bit tedious because it uses a wizard, and the import is slow and buggy with large files, but it may be a good option for small files with VARCHAR data. Fields are not optimized.
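As referenced above, a quick way to check that limit before loading a large file (a sketch; the value is only an example, and SET GLOBAL requires the appropriate privilege):

-- Both LOAD DATA INFILE and phpMyAdmin imports can hit this limit
SHOW VARIABLES LIKE 'max_allowed_packet';
-- Raise it server-wide if needed (example value: 256 MB; applies to new connections)
SET GLOBAL max_allowed_packet = 256 * 1024 * 1024;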
Not sure if you have all this set up, but I am using PHP and MySQL, so I use the PHP class PHPExcel. This takes a file in nearly any format (xls, xlsx, csv, ...) and then lets you read and/or insert.
So what I wind up doing is loading the Excel file into a PHPExcel object and then looping through all the rows. Based on what I want, I write a simple SQL INSERT command to insert the data from the Excel file into my table.
On the front end it is a little work, but it's just a matter of tweaking some of the existing code examples. Once you have it dialed in, making changes to the import is simple and fast.
The best and easiest way is to use the "MySQL for Excel" app, a free app from Oracle. It adds a plugin to Excel to export and import data to MySQL. You can download it from here.
When using text files to import data, I had problems with quotes and with how Excel was formatting numbers. For example, my Excel configuration used the comma as the decimal separator instead of the dot.
Now I use Microsoft Access 2010 to open my MySQL table as a linked table. There I can simply copy and paste cells from Excel to Access.
To do this, first install the MySQL ODBC driver and create an ODBC connection.
Then in Access, on the "External Data" tab, open the "ODBC Database" dialog and link to any table using the ODBC connection.
Using MySQL Workbench, you can also copy and paste your Excel data into the result grid of MySQL Workbench. I gave detailed instructions in this answer.
The fastest and simplest way is to save the XLS as ODS (OpenDocument Spreadsheet) and import it from phpMyAdmin.
For a step-by-step example of importing Excel 2007 into MySQL with the correct encoding (UTF-8), search for this comment:
"Posted by Mike Laird on October 13 2010 12:50am"
at the following URL:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could use DocChow, a very intuitive GUI for importing Excel into MySQL, and it's free on most common platforms (including Linux).
Especially if you are concerned about date and datetime datatypes, DocChow handles them easily. If you are working with multiple Excel spreadsheets that you want to import into one MySQL table, DocChow does the dirty work.
Step 1: Create your CSV file.
Step 2: Log in to your MySQL server:
mysql -uroot -pyourpassword
Step 3: Load your CSV file:
LOAD DATA LOCAL INFILE '/home/my-sys/my-excel.csv'
INTO TABLE my_tables
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
(Country, Amount, Qty);
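If the LOCAL load is refused, the local_infile setting may be disabled on the server or client; a quick check (enabling it is a configuration decision):
SHOW VARIABLES LIKE 'local_infile';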
Another useful tool, and a MySQL front-end replacement, is Toad for MySQL. Sadly it is no longer supported by Quest, but it is a brilliant IDE for MySQL, with import and export wizards catering for most file types.
If you are using Toad for MySQL, the steps to import a file are as follows:
Create a table in MySQL with the same columns as the file to be imported.
Now that the table is created, go to Tools > Import > Import Wizard.
In the Import Wizard dialogue box, click Next.
Click Add File, browse, and select the file to be imported.
Choose the correct delimiter ("," separated for a .csv file).
Click Next and check that the mapping is done properly.
Click Next, select the "A single existing table" radio button, and select the table to be mapped from the Tables dropdown menu.
Click Next and finish the process.
If you don't like plugins, VBA, and external tools, I have an Excel file that, using formulas only, allows you to create INSERTs/UPDATEs. You only have to put the data in the cells:
As an extra, there's another tab in the file to CREATE TABLEs:
The file can be found on the following link:
EXCEL FILE
I've had good results with the Tools / Import CSV File feature in HeidiSQL, with CSV files directly exported from Excel 2019 with "Save As..."
It uses LOAD DATA INFILE internally, but with a GUI, and it also analyzes the CSV file before passing it to LOAD DATA INFILE, so it can, for example, create the table using the first row as column names and guess the column data types (the <New table> option).