phpMyAdmin Import Excel spreadsheet with date and time - mysql

I have an Excel file (.xlsx) and I am trying to import it using phpMyAdmin.
(not using .csv)
In the Excel file, the first row contains the headers of the fields from my table and the remaining rows are the data I want to import, like below.
As you can see, column B contains the date and time. (yyyy-mm-dd hh:mm:ss)
In phpMyAdmin I have the table set up as the following:
When I go to the 'Import' section to import the Excel file, I select the following:
I then clicked on 'Go' to import the file.
When I do that, the date & time field is converted to a number, like the following:
Am I doing something wrong? How can I make it so that the date & time is imported the same as in the Excel file?
In Excel I did format that cell with the custom format yyyy-mm-dd h:mm:ss.
Any ideas on why this is not importing correctly?
(I did try saving the file as .xls but got the same result.)

The spreadsheet stores the date as a raw serial value when saved; the cell format that displays it as yyyy-mm-dd hh:mm:ss is applied separately on top of that. phpMyAdmin doesn't apply the cell format, so it imports the raw number instead of the formatted date.
Saving your sheet as a CSV should work. You can open the CSV in Notepad to confirm that the CSV version contains the correct date format.
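If you'd rather script the conversion than use Save As by hand, here is a minimal Excel VBA sketch (the sheet name and output path are assumptions). A CSV export writes each cell's displayed text, so the custom yyyy-mm-dd hh:mm:ss format is exactly what ends up in the file for phpMyAdmin's CSV import:

Sub ExportSheetAsCsv()
    ' Hypothetical sheet name and output path - adjust to your workbook.
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("Sheet1")
    ws.Copy                       ' copy the sheet into a new workbook
    Application.DisplayAlerts = False
    ActiveWorkbook.SaveAs Filename:="C:\export\orders.csv", FileFormat:=xlCSV
    ActiveWorkbook.Close SaveChanges:=False
    Application.DisplayAlerts = True
End Sub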

It will work if you save the file as an OpenDocument Spreadsheet (.ods) from Excel and import it into phpMyAdmin using the OpenDocument Spreadsheet format.

Related

Missing rows while exporting more than 1 million records into a csv file via SSIS

Task: I need to export 1.1 million records to a CSV file.
I loaded them via an SSIS Data Flow.
As you can see, there are 1,100,800 rows loaded from a table (source) to the flat file location, which is a CSV file.
My Flat File destination's filename is Test.csv.
Now when I open the CSV file I get the error
"file not loaded completely"
Now when I look at the records at the very end of my CSV file... Sorry, I cannot attach the CSV file due to data integrity.
So I only see records up to 1,048,578, but I loaded 1,100,800 rows, so there are some missing rows, and I cannot add them manually either. At the end of the CSV it does not let me type into the next row.
Any idea why?
As a workaround I loaded the data into separate CSV files: 1 million rows in one CSV and the rest in another.
But I really want to know why it is doing this.
Thank you in advance for looking at this.
It's Excel's fault. It only supports 1,048,576 rows.
https://support.office.com/en-us/article/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3
The error you're getting is because you're trying to open a .csv with more than the acceptable number of rows. Try opening the file in a different app, like Notepad++.
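If you want to confirm that the export itself is complete, count the lines without opening the file in Excel. A quick VBA sketch (the path is an assumption) that tallies the rows in the CSV:

Sub CountCsvRows()
    ' Hypothetical path - point this at your exported file.
    Dim f As Integer, textLine As String, rowCount As Long
    f = FreeFile
    Open "C:\export\Test.csv" For Input As #f
    Do While Not EOF(f)
        Line Input #f, textLine
        rowCount = rowCount + 1
    Loop
    Close #f
    ' Subtract 1 if the first row is a header.
    MsgBox "Row count: " & rowCount
End Sub

If the count matches the number of rows you loaded, the CSV is fine and only Excel's view of it is truncated.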

Paste CSV or tab-delimited data into Excel with NO formatting

I'm pasting tab-delimited data from Notepad++ into Excel (about 50k rows and 3 columns). No matter how many different ways I try it, Excel wants to merge everything from a cell containing one " up to the next instance of " into a single cell's contents.
For example, if my data looked like this:
"Apple 1.0 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4.0 Field
The excel file looks like this:
Apple1.0StoreBanana1.3Store
Cherry 2.5GardenWatermelon4.0Field
One way to get around this is to open the file as a CSV in Excel; however, this leads to Excel simplifying the number values using its "General" format. So the data would look like the following:
"Apple 1 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4 Field
The data I'm getting is coming from SQL Server Studio so my options for file formats are:
.CSV
.Txt (Tab-delimited)
Copy Pasting from Query results
The solution I'm looking for is to have the data represented in Excel with no Excel processing applied to the quotations, numbers, or any other cell contents.
Don't open the file directly in Excel. Instead, import it and control the data types and file layout.
Open a new Excel document:
Select the Data menu:
Select From Text in the Get External Data section.
Select the file to import.
On step 1 of the import wizard, select Delimited.
Click Next.
Select the Tab checkbox and change the text qualifier to {none}.
Click Next.
Set the column data types to General, Text, Text.
Click Finish.
Excel auto-imports the data as best it can when you open the file directly in Excel. You lose flexibility/control when this happens. It's better to import and control it yourself to get the fine adjustments you're looking for.
You end up with something like this:
By treating the numbers like text, the zeros don't get messed up.
By setting the text qualifier to none, the quotes don't get messed up.
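If you have to repeat this for many files, the same wizard settings can be scripted. A rough VBA sketch (the path, target range, and three-column layout are assumptions) that mirrors the settings above - tab delimiter, no text qualifier, column types General, Text, Text:

Sub ImportTabFileAsText()
    ' Hypothetical source file; adjust the path and column types as needed.
    With ActiveSheet.QueryTables.Add( _
            Connection:="TEXT;C:\export\data.txt", _
            Destination:=ActiveSheet.Range("A1"))
        .TextFileParseType = xlDelimited
        .TextFileTabDelimiter = True
        .TextFileTextQualifier = xlTextQualifierNone
        .TextFileColumnDataTypes = Array(xlGeneralFormat, xlTextFormat, xlTextFormat)
        .Refresh BackgroundQuery:=False
    End With
End Sub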
Have you tried opening it via Text Import?
Go to the Data tab > From Text (third from the left by default).
You will get a window similar to Text To Columns.
Select the correct delimiter, remember to remove the quote sign from the Text qualifier, and mark all columns as text to avoid Excel autoformatting.
EXCEL TIP: TIME SAVING IN IMPORTING CSV FILES INTO EXCEL: If you pre-set your Text To Columns delimiter parameters correctly in Excel (e.g. specify tabs as the delimiter) and then copy and paste the CSV data, Excel will put the pasted data directly into the correct columns without you having to go through the Text To Columns rigmarole. This was particularly time-saving when I had to import hundreds of bank statements into Excel.
However, if your Text To Columns delimiters are pre-specified incorrectly as, say, comma, and you are importing tab-delimited files, then Excel will dump all the data into one column, and you will have to go through the time-consuming process of converting Text To Columns for each statement.
Excel looks at the existing Text To Columns delimiters to see if it can use those to make your life easier when pasting data.
Hope that tip helps (it saved me several hours).

Import csv data into Mysql database

We receive a csv-file for updating our order database.
Now in the file there is a tracking code. In the CSV file it displays as, for instance, 26E+12. This is scientific notation. When I click the cell it shows as 563200000 or something like that.
Now we are running a cronjob which imports this data into our database.
The only problem is that it imports the tracking code as 26E+12. When we want to check the tracking code in the backend of our site, a URL is built from it, so you get something like: http://trackmynumber/26E+12...
This is not readable for the carrier website because they are expecting http://trackmynumber/563200000
Is there any way to convert or extract the real number from the csv-file so it imports as a number in the database?
Change the datatype of the column that receives that value to FLOAT; MySQL will then interpret 26E+12 as a numeric value instead of storing the literal text.

How to format time data when importing txt file to access using VBA?

I am importing data from a txt file into an Access database with VBA using TransferText; however, it reads the first column as a time and returns 09:11:00. How can I format the values before or after importing the data? Thanks in advance!
data:
09:11,10
10:10,11
Sub Import()
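' Import the delimited file D:\T1.txt into TheTable (True = first row contains field names).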
DoCmd.TransferText acImportDelim, , "TheTable", "D:\T1.txt", True
End Sub
Before importing the data, I don't think there is a way for you to format the text, since you import it all in one go. If you need to, you can open up the file and read it line by line to change its format before importing - something like the sketch below.
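A rough sketch of that pre-processing approach (the paths are assumptions, and it assumes the first column of TheTable is defined as a Text field so the quoted value is kept as-is):

Sub ImportWithPreprocessing()
    ' Read D:\T1.txt line by line, wrap the first field in quotes so Access
    ' keeps it as text rather than converting it to a time value, write the
    ' result to a new file, then import that file instead.
    Dim fIn As Integer, fOut As Integer
    Dim textLine As String, parts() As String
    Dim isHeader As Boolean

    fIn = FreeFile
    Open "D:\T1.txt" For Input As #fIn
    fOut = FreeFile
    Open "D:\T1_clean.txt" For Output As #fOut

    isHeader = True
    Do While Not EOF(fIn)
        Line Input #fIn, textLine
        If Not isHeader Then
            parts = Split(textLine, ",")
            parts(0) = Chr(34) & parts(0) & Chr(34)   ' quote the first column
            textLine = Join(parts, ",")
        End If
        Print #fOut, textLine
        isHeader = False
    Loop

    Close #fIn
    Close #fOut

    DoCmd.TransferText acImportDelim, , "TheTable", "D:\T1_clean.txt", True
End Sub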
After importing, if you need a formatted version of the column in your table, you can try the CDate and Format functions, like below:
CDate(Format(TheTable.field1, "dd:mm,yy"))
I assume your data is in Day:Month,Year format. If not, you can just change the format string accordingly.

Ms-Access trying to use "transfer text" to create a csv file with a unique filename

I am trying to use an automated macro to export an MS Access table to a CSV file. I want the destination file to have a unique name, and I reckoned that using Now() formatted as yyyymmddhhnn would be a good way to achieve this.
I have got transfer text working ok from my macro, and I have set up an export file spec for the transfer.
I am using ="C:\batchfile_" & Format(Now(),"yyyymmddhhnn") & ".csv" in the filename argument in the macro. This bit works.
But when I try to run the macro, it tells me that the filename doesn't exist and then the export doesn't complete. I am not sure why this is, but I think it is because the export file specification is expecting the destination file to have the same filename and column structure as the source table.
Does anyone know a way around this?
Eric
This is a very old thread, but I am posting my solution so that it may be useful for someone else.
TransferText works fine as long as the variables are supplied properly; you can also check the arguments other than the filename, such as the data source. Alternatively, you can create the file yourself with the Open statement, by opening a text file and writing the recordset data out in CSV format, as sketched below.
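Here is a rough sketch of both suggestions (the table name, export specification name, and folder are assumptions - adjust them to your database):

Sub ExportViaTransferText()
    ' Build the filename in VBA first, then hand it to TransferText.
    Dim strFile As String
    strFile = "C:\batchfile_" & Format(Now(), "yyyymmddhhnn") & ".csv"
    DoCmd.TransferText acExportDelim, "MyExportSpec", "tblOrders", strFile, True
End Sub

Sub ExportViaOpenStatement()
    ' Alternative: write the CSV yourself from a recordset with Open / Print #.
    ' (Values containing commas or quotes would need extra escaping.)
    Dim rs As DAO.Recordset, f As Integer, i As Integer
    Dim rowText As String, strFile As String
    strFile = "C:\batchfile_" & Format(Now(), "yyyymmddhhnn") & ".csv"
    Set rs = CurrentDb.OpenRecordset("tblOrders")
    f = FreeFile
    Open strFile For Output As #f
    Do While Not rs.EOF
        rowText = ""
        For i = 0 To rs.Fields.Count - 1
            If i > 0 Then rowText = rowText & ","
            rowText = rowText & rs.Fields(i).Value
        Next i
        Print #f, rowText
        rs.MoveNext
    Loop
    Close #f
    rs.Close
End Sub

The second approach avoids any dependence on the export specification, since the file is written directly from the recordset.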