I have an Excel file with a huge number of records that I want to load into an OLE DB destination in SSIS. While using the Data Conversion transformation to change the data types, one of the columns, "Date", shows up as "Unicode string [DT_WSTR]". How do I change it to datetime?
Can someone please help me with this.
You have to use
DT_DBTIMESTAMP
if the column in your DB is datetime.
Your string needs to be in this format:
yyyy-MM-dd hh:mm:ss
If you can get your string dates into that format, you're guaranteed to get a correct conversion.
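The conversion SSIS performs can be illustrated outside of SSIS. Here is a minimal Python sketch of parsing that exact string layout (Python's strptime directives stand in for the SSIS format string; the function name is just for illustration):

```python
from datetime import datetime

def parse_db_timestamp(value: str) -> datetime:
    # yyyy-MM-dd hh:mm:ss maps to %Y-%m-%d %H:%M:%S in strptime terms.
    # Anything that deviates from this layout raises ValueError,
    # analogous to a failed DT_DBTIMESTAMP conversion in SSIS.
    return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")

print(parse_db_timestamp("2017-07-01 14:59:55"))
```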
Related
I'm attempting to use Data Factory to import a CSV file from Azure File storage into SQL Azure. So far I'm using the copy data task to read the CSV and save into SQL Azure.
My CSV file contains 2 date columns with the date in the format dd/MM/yyyy. How can I set Data Factory to validate and read dates in this format?
You can follow my steps; I tried this and Data Factory can validate and read dates in the "dd/MM/yyyy" format.
This is my CSV file; it also has two columns with dates in the "dd/MM/yyyy" format.
The difference between us is that I import my file from Azure Blob storage into my Azure SQL database using Copy Data.
If you want Data Factory to validate and read dates in the "dd/MM/yyyy" format, then during File format settings you must set the schema and specify the column type and format. Please see this picture:
After the Copy activity completes, the dates in "dd/MM/yyyy" format will be parsed into the default "yyyy-MM-dd" format.
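The parse the Copy activity applies can be sketched in plain Python, assuming the same source and target layouts (the function name is illustrative, not part of Data Factory):

```python
from datetime import datetime

def reformat_date(value: str) -> str:
    # Parse the source layout explicitly so 06/11/2017 is read as
    # 6 November, not June 11, then emit the default yyyy-MM-dd form.
    return datetime.strptime(value, "%d/%m/%Y").strftime("%Y-%m-%d")

print(reformat_date("06/11/2017"))  # 2017-11-06
```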
Hope this helps.
I am trying to import a 9 GB CSV file which has its date column stored as a text field.
I want to import it into MS SQL Server with a simple yyyy/mm/dd format.
But the current format is yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
Example:
2017-07-01T14:59:55.711'+0000'
2017-07-01T14:59:55.711Z
Expected after importing the data:
2017-07-01
How can I achieve this?
Import the date/time as text (the default), then change the column type to 'date' afterwards.
Edit: I missed a step; you need to truncate the imported 'thedate' string before changing the column type:
update importeddata set thedate = SUBSTRING(thedate, 0, 24)
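The truncate-then-cast step can be sketched in Python to show why 24 is the right length argument (in T-SQL, SUBSTRING is 1-based, so start 0 with length 24 yields the first 23 characters, i.e. yyyy-MM-ddTHH:mm:ss.SSS with the 'Z' or '+0000' suffix dropped). The function name here is illustrative:

```python
from datetime import datetime

def extract_date(raw: str) -> str:
    # Keep the first 23 characters: yyyy-MM-ddTHH:mm:ss.SSS.
    # This mirrors SUBSTRING(thedate, 0, 24) in T-SQL.
    trimmed = raw[:23]
    return datetime.strptime(trimmed, "%Y-%m-%dT%H:%M:%S.%f").strftime("%Y-%m-%d")

print(extract_date("2017-07-01T14:59:55.711Z"))        # 2017-07-01
print(extract_date("2017-07-01T14:59:55.711'+0000'"))  # 2017-07-01
```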
I have a S3 bucket with many JSON files.
JSON file example:
{"id":"x109pri", "import_date":"2017-11-06"}
The "import_date" field is DATE type in standard format YYYY-MM-DD.
I am creating a Database connection in Athena to link all these JSON files.
However, when I create a new table in Athena and specify this field format as DATE I get: "Internal error" with no other explanation provided. To clarify, the table gets created just fine but if I want to preview it or query, I get this error.
However, when I specify this field as STRING then it works fine.
So the question is, is this a BUG or what should be the correct value for Athena DATE format?
The date column type does not work with certain combinations of SerDe and/or data source.
For example using a DATE column with org.openx.data.jsonserde.JsonSerDe fails, while org.apache.hive.hcatalog.data.JsonSerDe works.
So with the following table definition, querying your JSON will work.
CREATE EXTERNAL TABLE datetest (
  id string,
  import_date date
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 's3://bucket/datetest'
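If you want to rule out bad data before blaming the SerDe, a hypothetical pre-flight check (not part of Athena) can verify that every import_date in your JSON lines really is in the strict YYYY-MM-DD layout the DATE type expects:

```python
import json
from datetime import date

def validate_record(line: str) -> dict:
    """Parse one JSON line and verify import_date is YYYY-MM-DD.
    date.fromisoformat raises ValueError on any other layout."""
    record = json.loads(line)
    date.fromisoformat(record["import_date"])
    return record

rec = validate_record('{"id":"x109pri", "import_date":"2017-11-06"}')
print(rec["id"])  # x109pri
```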
I have come across a somewhat difficult task. I have an Excel document with various columns, two of which are date columns with values in the format yyyy-mm-dd hh:mm:ss.
I need to import all this data into MySQL, and so far I have tried to do it with a CSV file using phpMyAdmin.
The problem is that saving to CSV changes the date format from yyyy-mm-dd hh:mm:ss to dd/mm/yyyy hh:mm:ss.
What other ways exist to import data originating from an Excel file into MySQL while keeping the date format?
I would appreciate some simple solution if such exists.
Thank you!
I'm not familiar with importing Excel files into MySQL; in SQL Server this is trivial using the import wizard, so I'd assume something similar exists for MySQL.
If you need to, you can convert the value to text with the format of your choosing in Excel with the following:
=TEXT(A1,"yyyy-mm-dd hh:MM:ss")
This format will persist in the csv.
Answering my own question here: one way to export dates from Excel into a CSV file and keep the date format as yyyy-mm-dd or yyyy-mm-dd hh:mm:ss is to save the file as an .ods document, the file format of OpenOffice.
From the .ods file you can then save your document as a CSV file and the dates are kept as yyyy-mm-dd or yyyy-mm-dd hh:mm:ss. For some reason OpenOffice keeps the selected date format when saving to CSV and Excel does not.
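Another option, if the CSV has already been written with the mangled dd/mm/yyyy layout, is to rewrite the date columns back before importing into MySQL. A minimal Python sketch (the column name "created" and function name are hypothetical; adjust to your file):

```python
import csv
import io
from datetime import datetime

def normalize_dates(csv_text: str, date_cols: list) -> str:
    """Rewrite dd/mm/yyyy hh:mm:ss values in the named columns
    back to yyyy-mm-dd hh:mm:ss."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for col in date_cols:
            row[col] = datetime.strptime(
                row[col], "%d/%m/%Y %H:%M:%S"
            ).strftime("%Y-%m-%d %H:%M:%S")
        writer.writerow(row)
    return out.getvalue()

sample = "id,created\n1,01/07/2017 14:59:55\n"
print(normalize_dates(sample, ["created"]))
```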
Hope this helps someone.
My date format is YYYYMMDD, and when I convert it with a Data Conversion task I get the error below:
The data value cannot be converted for reasons other than sign mismatch or data overflow.
In my Data Conversion I have selected DT_DATE,
and in the database the column's data type is date.
The strange thing is that when I execute my package with the cast done in the source, SELECT CAST(myDate AS DATE), the package works fine.
It's a common issue. If you use a Derived Column transformation, you need to slice the value into its component parts (year, month, day) and then concatenate it back together before casting. That's ugly and time consuming.
Instead, assuming this is coming from a flat file, just make it a date on import by setting the type in your connection manager to a date type that is compatible with your destination. Then on your Flat File Source, under advanced settings, set FastParse to true for that column. See my answer on "Import String Date in Derived column" for a pictorial walkthrough; I also addressed it on "SSIS importing datetime column into SQL Server 2008".
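Why the YYYYMMDD value trips up DT_DATE but a plain cast works can be seen with an explicit-format parse. A Python sketch (the function name is illustrative; FastParse does the equivalent fixed-layout parse inside SSIS):

```python
from datetime import datetime, date

def parse_yyyymmdd(value: str) -> date:
    # "20170701" has no separators, so a generic locale-aware parser
    # can't guess the layout; an explicit %Y%m%d format avoids the
    # slice-and-concatenate derived-column dance entirely.
    return datetime.strptime(value.strip(), "%Y%m%d").date()

print(parse_yyyymmdd("20170701"))  # 2017-07-01
```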