We receive a CSV file for updating our order database.
The file contains a tracking code, which displays in the CSV as, for instance, 26E+12, i.e. a number in scientific notation. When I click the cell it shows as 563200000 or something like that.
We are running a cron job which imports this data into our database.
The only problem is that it imports the tracking code as 26E+12. When we want to check the tracking code in the backend of our site, it is pasted into a URL, so you get something like: http://trackmynumber/26E+12...
This is not readable for the carrier website, because it expects http://trackmynumber/563200000
Is there any way to convert or extract the real number from the CSV file so that it imports as a plain number into the database?
Change the data type of the column that receives that value to float.
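If the CSV itself already holds the value in scientific notation, another option is a small pre-processing step that expands it back to a plain digit string before the cron import runs. A minimal sketch in Python, where the file names and the tracking_code column name are assumptions, and which assumes the file still carries enough precision to recover the full number:
import csv

with open('orders.csv', newline='') as src, open('orders_fixed.csv', 'w', newline='') as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Hypothetical column name "tracking_code". float() understands "26E+12";
        # formatting with no decimals and no exponent writes it back as plain digits.
        row['tracking_code'] = format(float(row['tracking_code']), '.0f')
        writer.writerow(row)
The cron job could then import orders_fixed.csv instead, so the URL ends in the full number rather than the scientific notation.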
I am trying to do what the title says, and also do it for new records. I cannot link the CSV file because it exceeds the 255 limit, so I am attempting to split up the table.
I have the below table in Access:
DateOfTest      1
Time            2
PromptTime      3
TestSequence    4
PATResults      5
Logs            6
Serial Number   7
Obviously, where the numbers are, I want the data from the CSV to be inserted.
I have created a form including a button so I can run some VBA, but I cannot find the correct information online for my needs, and as I am new to VBA it is also a bit confusing.
I have attempted some random code, but I was just spraying and praying at that point.
I am not sure I understood your question. In the import tool you can choose columns, but if you want to do it with a script, I would suggest performing a pre-processing phase with simple Python and pandas: read the CSV file, remove any unwanted columns, and save the result to a new file that can be loaded directly into Excel.
Something like this:
import pandas as pd

# Read the CSV, drop the unwanted column by name, and write the result
# out as an Excel workbook (writing .xlsx needs an engine such as openpyxl).
df = pd.read_csv('csvfile.csv')
df.drop('column_name', inplace=True, axis=1)
df.to_excel('filename.xlsx', index=False, header=True)
I'm pasting tab-delimited data from Notepad++ into Excel (about 50k rows and 3 columns). No matter how many different ways I try it, Excel wants to merge everything from a cell containing one " up to the next instance of " into a single cell.
For example, if my data looked like this:
"Apple 1.0 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4.0 Field
The Excel file looks like this:
Apple1.0StoreBanana1.3Store
Cherry 2.5GardenWatermelon4.0Field
One way to get around this is to open the file as a CSV in Excel; however, this leads to Excel reformatting the number values to simplified ones using its "General" format. So the data would look like the following:
"Apple 1 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4 Field
The data I'm getting is coming from SQL Server Studio so my options for file formats are:
.CSV
.Txt (Tab-delimited)
Copy Pasting from Query results
The solution I'm looking for is to have the data represented in Excel with no Excel processing applied to the quotation marks, numbers, or any other cell contents.
Don't open the file directly in Excel. Instead, import it and control the data types and file layout.
Open a new Excel document:
Select the Data menu:
Select From Text in the Get External Data section.
Select the file to import.
On step 1 of the import wizard, select Delimited.
Click Next.
Select the Tab checkbox and change the text qualifier to {none}.
Click Next.
Set the column data types to General, Text, Text.
Click Finish.
Excel auto-imports the data as best it can when you open the file directly. You lose flexibility and control when that happens; it is better to import and control it yourself to get the fine adjustments you're looking for.
You end up with something like this:
By treating the numbers like text, the zeros don't get messed up.
By setting the text qualifier to none, the quotes don't get messed up.
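If you would rather script those same settings than click through the wizard each time, a rough pandas equivalent is sketched below (the file names are assumptions): quoting is disabled so the " characters are left alone, and every column is read as text so the numbers are not reformatted.
import csv
import pandas as pd

# Read the tab-delimited export with no text qualifier (quoting disabled)
# and every column as plain text, mirroring the wizard settings above.
df = pd.read_csv('export.txt', sep='\t', quoting=csv.QUOTE_NONE, dtype=str, header=None)
df.to_excel('export.xlsx', index=False, header=False)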
Have you tried opening it via Text Import?
Go to the Data tab > From Text (third from the left by default).
You will get a window similar to Text To Columns.
Select the correct delimiter, remember to remove the quote sign from Text Qualifier, and mark all columns as text to avoid Excel auto-formatting.
Step 1, Step 2, Step 3: (screenshots of the import wizard steps)
Excel tip for saving time when importing CSV files into Excel: if you pre-set your Text-To-Columns delimiter parameters correctly in Excel (e.g. specify tabs as the delimiter) and then copy and paste the CSV data, Excel will put the pasted data straight into the correct columns without you having to go through the Text-To-Columns rigmarole. This was particularly time-saving when I had to import hundreds of bank statements into Excel.
However, if your Text-To-Columns delimiters are pre-set incorrectly as, say, comma while you are pasting tab-delimited data, then Excel will dump all the data into one column, and you will have to go through the time-consuming process of running Text-To-Columns for each statement.
Excel looks at the existing Text-To-Columns delimiters to see if it can use those to make your life easier when pasting data.
Hope that tip helps (it saved me several hours).
I have an Excel file (.xlsx) and I am trying to import it using phpMyAdmin
(not using .csv).
In the Excel file the first row contains the headers of the fields from my table, and the rest of the rows is the data I want to import, like below.
As you can see, column B contains the date and time (yyyy-mm-dd hh:mm:ss).
In phpMyAdmin I have the table set up as the following:
When I now go to the 'Import' section to import the Excel file, I select the following:
I then click 'Go' to import the file.
When I do that, the date & time field is converted to a number, like the following:
Am I doing something wrong? How can I make the date & time come through the same as in the Excel file?
In Excel I did format that cell with the custom format yyyy-mm-dd h:mm:ss.
Any ideas on why this is not importing correctly?
(I did try saving the file as .xls but got the same result.)
The spreadsheet stores the date as a raw value when it is saved, with the cell format kept separately and applied afterwards. phpMyAdmin doesn't pick up on this and imports the raw, unformatted value.
Saving your sheet as a CSV should work. You can open the CSV in Notepad to confirm that the CSV version contains the correct date format.
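If you want to script that conversion rather than do it by hand, a small pandas sketch along these lines should work (the file names are assumptions, and it assumes pandas recognises column B as a real date column):
import pandas as pd

# Reading .xlsx requires openpyxl.
df = pd.read_excel('orders.xlsx')
# Write the CSV with datetimes rendered exactly as MySQL expects them.
df.to_csv('orders.csv', index=False, date_format='%Y-%m-%d %H:%M:%S')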
It will also work if you save the file from Excel as an OpenDocument Spreadsheet (.ods) and import it into phpMyAdmin using the OpenDocument Spreadsheet format.
I have been trying to import a CSV into a process node that doesn't care about the rest of the fields (the field count is dynamic as well), only about 2 or 3 of them. But among those other fields I have date fields that are being imported the wrong way: the field gets automatically assigned as Date20. while it is actually a datetime. Another field that is supposed to be a 16-digit character value is being imported as a number and is getting truncated (it shows in the form 9.401153E15). After processing, this node exports the data to a CSV, and I see all these errors there.
I checked a few links, like http://www2.sas.com/proceedings/sugi30/038-30.pdf, which are relevant to the topic but not in this context. How can I solve this?
PROC IMPORT for CSV simply generates DATA step code, so I would recommend copying that DATA step code into your program (it should be visible in the log) and editing it to reflect your needs.
I am trying to use an automated macro to export an MS Access table to a CSV file. I want the destination file to have a unique name, and I reckoned that using Now() formatted as yyyymmddhhnn would be a good way to achieve this.
I have got TransferText working OK from my macro, and I have set up an export file specification for the transfer.
I am using ="C:\batchfile_" & Format(Now(),"yyyymmddhhnn") & ".csv" in the filename argument in the macro. This bit works.
But when I try to run the macro, it tells me that the filename doesn't exist and then the export doesn't complete. I am not sure why this is, but I think it is because the export file specification is expecting the destination file to have the same filename and column structure as the source table.
Does anyone know a way around this?
Eric
This is a very old thread, but I am posting my solution so that it may be useful for someone else.
TransferText works fine as long as the arguments are supplied properly; check the options other than the file name and data source. Alternatively, create the file yourself with an Open statement: open a text file and write the recordset data out in CSV format.