MySQL for Excel TIME Column

I wanted to check in here to see if anyone has any suggestions before submitting a bug report to Oracle.
I'm using MySQL for Excel in Excel 2013, and when trying to import a table that has a TIME type column, the import fails and I get the following error message:
An error occurred when trying to import the data.
Method's type signature is not Interop compatible.
I looked up information about "Method's type signature is not Interop compatible." but only found results relating to C/C#.
I'm trying to insert data that is formatted as a relative time duration, e.g. 00:02:34 (2 minutes 34 seconds), but I cannot find a way to do this. MySQL for Excel won't import a TIME column, and when I define the column as VARCHAR, it does some calculation on each value when it's submitted to the database; 00:02:34 ends up as 0.00178240740740741 in the database.
Is there a different column type that I can use that will leave a string like 00:00:00 unformatted?

That's because Excel can't handle the conversion of that data type, so what is being stored, the .001782..., is the time expressed in days. If you multiply that number by 24*60*60 (the number of seconds in a day) you get your 2:34...
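If you want to sanity-check that arithmetic, here is a minimal C# sketch (using the exact value from the question) that converts Excel's fraction-of-a-day representation back into a duration:

    using System;

    class FractionOfDayDemo
    {
        static void Main()
        {
            // The value that ended up in the database for "00:02:34":
            // Excel stores times as a fraction of a 24-hour day.
            double fractionOfDay = 0.00178240740740741;

            // 0.001782... days * 86,400 seconds/day ~ 154 seconds = 2 minutes 34 seconds.
            TimeSpan duration = TimeSpan.FromDays(fractionOfDay);

            Console.WriteLine(duration.ToString(@"hh\:mm\:ss")); // prints 00:02:34
        }
    }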
When importing into Excel you can prefix the string value with a single quote ('). This keeps the formatting in the cell, but it may cause problems with the data later when you have to strip the ' off for processing.

This issue in MySQL for Excel has been fixed (at least partially) since version 1.3.0, when you choose the Create Excel Table option in the Advanced Options menu of the Import dialog. (If you leave that option unchecked, the error still appears.)
A bug report with a similar description of the error message you describe has been raised with the MySQL team in charge of MySQL for Excel. If you want to receive updates on this, feel free to subscribe to Email Updates at this link: http://bugs.mysql.com/bug.php?id=72504

Related

SSIS Data conversion Package Error

This is what happens every time I try to run the package:
It appears this error is coming from a data flow task where you are using a text or Excel file source and importing into a database destination. The initial errors, which are likely causing the later ones, are due to an inconsistency in the data types: some of the source fields are defined as Unicode where non-Unicode is expected. The message shows this is happening with the columns VILLE, HABITATION, and PROFESSION.
This can be corrected by inserting a Data Conversion transformation between the source and the destination in the data flow. There you can convert the data types, creating new columns that can then be used in the destination's column mappings.
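To make the Unicode/non-Unicode distinction concrete, the Data Conversion transformation is conceptually doing something like this hypothetical C# sketch, which turns a Unicode (DT_WSTR-style) string into bytes in a specific code page; code page 1252 is only an example, so match whatever your destination columns actually use:

    using System;
    using System.Text;

    class UnicodeToCodePageDemo
    {
        static void Main()
        {
            // A Unicode (DT_WSTR-style) value coming from the source file.
            string ville = "Orléans";

            // Convert it to a non-Unicode (DT_STR-style) value in a specific
            // code page -- 1252 is just an example.
            Encoding cp1252 = Encoding.GetEncoding(1252);
            byte[] nonUnicodeBytes = cp1252.GetBytes(ville);

            // Round-trip back to a string to see what the destination would receive.
            Console.WriteLine(cp1252.GetString(nonUnicodeBytes)); // Orléans
        }
    }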
Hope this helps.

SSIS Errors for simple CSV Data Flow

Sorry to darken your day with my troubles, but SSIS has broken me! I am new to SSIS and I just seem to be misunderstanding it.
For background: I have a few versions of a basic package that includes a Foreach Loop container and a Data Flow with a few Derived Columns that imports CSV files into a SQL Server Staging table. It is very straightforward and does include an Execute SQL task and a File Move but those work fine. The issues are with the Foreach loop and the Data Flow.
I have one version of this package (let’s call it “A”) that seemed to be working fine. It would process multiple files in a folder, insert records into the staging table, properly execute the SQL Statements, and move the files to Archive. Everything seemed fine until I carefully QA’d the process. Turns out it was duplicating the data from one file, and never importing the data from a second Source File! Yet, the second/dupe round of data included the Source Filename (via a derived column) of the second file (but the data from the first). So it looked like I had successfully processed BOTH files until I looked at the actual data and saw that none of the values from the second source file were ever written to the Staging table.
Once I discovered this, I figured that the problem was in the Foreach loop and how I set up the different file path & name variables. So, I decided to try to make a new version of the package. I started by copying package A and created package B. In B, I deleted the Source Connection Manager and created a new Connection Manager along with all new file & path variables. I then tried to clean up/fix/replace various elements in my Data Flow and Foreach loop. In the process, I discovered that the Advanced Mappings from A – which DID work – were virtually all set up as String (even the Currency and Date columns). That did not seem right, so I modified each source money column by changing to data type Currency, and changed each date-related column to data type Date.
What followed has been dozens and dozens of Errors and I cannot get Package B to run. I have even changed all of the B data types back to String (mirroring the setup in Package A which DID work). But, still no joy.
This leads me to ask a few questions to those of you smarter than I:
1) Why can’t SSIS interpret Source CSV data using the proper data type? I.e. why do I need to set every Input column as a STRING when some columns are clearly & completely Numeric, Currency or Dates? (Yes, the Source CSV files are VERY clean – most don’t even have NULLS)
a. When I do change the Advanced mapping for a date-related Source column to Date, I get the ever present error message: [Flat File Source [30]] Error: Data conversion failed. The data conversion for column "Settle Date" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
2) When I reset the data types back to String in package B, I still get errors – usually Truncation errors (and Yes – I have adjusted the length to 250 in one of these columns).
a. Error Message: "The value could not be converted because of a potential loss of data.".
b. When I reset the Mappings to ignore the column (as a test), it throws a similar error at the next column.
3) Any ideas why Package A would dupe a file’s data and not process the second file, yet throw no errors and move both to Archive?
4) Why does the Data Viewer appear to have parsing errors (it shows data in the wrong columns) but when you use the Copy data feature in the data viewer and paste it into Excel, all of the data lines up perfectly?
5) Are there any tips & tricks that a rookie SSIS user needs to understand and which might not be apparent through the documentation and searching web articles as well as this site?
I can provide further details if they will help, but these packages are really very simple and should not be causing me this much frustration.
THANKS for any insights.
DGP
Wow, seems like you have a lot of SSIS issues... I think the reason the same file was being extracted is the way your 'variable mappings' are defined.
Have you had a look and followed this guide:
https://www.simple-talk.com/sql/ssis/ssis-basics-introducing-the-foreach-loop-container/
Hope this helps.
Shaheen
Thanks Tab & Shaheen,
To all SSIS rookies - please learn from my mistakes!
It appears that my issue was actually in how I identified the TEXT QUALIFIER in the Connection Manager. I had entered "" and that was causing problems with how my columns were being parsed. The parsing issues caused unexpected values to appear in some of the columns and that was causing the errors in the package.
When I tried changing the Text Qualifier to only ONE double quote - " - the whole thing worked!
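For anyone else wondering what the text qualifier actually does, here is a small hypothetical C# sketch using TextFieldParser: with quoted-field handling turned on (the equivalent of a single " qualifier), a quoted field that contains the delimiter stays in one column instead of being split.

    using System;
    using System.IO;
    using Microsoft.VisualBasic.FileIO; // reference Microsoft.VisualBasic.dll

    class TextQualifierDemo
    {
        static void Main()
        {
            // A sample CSV line where the second field is quoted and contains a comma.
            string csv = "1001,\"Smith, John\",250.00";

            using (var parser = new TextFieldParser(new StringReader(csv)))
            {
                parser.TextFieldType = FieldType.Delimited;
                parser.SetDelimiters(",");
                // The equivalent of setting the text qualifier to a single double quote (").
                parser.HasFieldsEnclosedInQuotes = true;

                string[] fields = parser.ReadFields();
                Console.WriteLine(string.Join(" | ", fields)); // 1001 | Smith, John | 250.00
            }
        }
    }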
As I mentioned - and as Shaheen suspected - my initial issues with the duplicate processing were probably due to how I set up the Foreach loop. I had already fixed that, but was still getting errors until I fixed the Text Qualifier.
I have only tested it a few times but it looks like that was the issue.
Thanks for the contributions.
DGP

Proc import imports wrong datatype from CSV

I have been trying to import a CSV into a process node that only cares about 2 or 3 of the fields and ignores the rest (the number of fields is dynamic as well). But among those other fields I have date fields that are being imported the wrong way. The field gets automatically assigned as Date20. while it is actually a datetime. Also, another field that is supposed to be a 16-digit character value is being imported as a number and is getting truncated (it shows in the form 9.401153E15). After processing, this node exports the data to CSV and I see all these errors there.
I checked a few links like http://www2.sas.com/proceedings/sugi30/038-30.pdf, which are relevant to the topic but not to this context. How can I solve this?
PROC IMPORT for CSV simply generates DATA step code, so I would recommend copying that DATA step code into your program (it should be visible in the log) and editing it to reflect your needs.

What are the non-obvious causes of a data type mismatch while loading data in an SSIS package?

I'm very new to SSIS, so please bear with me. A developer gave me a SSIS package and asked me to create a scheduled job on our database server to run it. He says it runs on his development box but I'm seeing the job fail with the following data type mismatch error:
0xC020837F The data type of column "output column 'col1' does not match the data type "System.Byte[]" of the source column 'col1'"
I opened the package in Visual Studio, and in the Input and Output Properties of the item, it shows both the External Column and Output Column as being of data type database timestamp [DT_DBTIMESTAMP]. I checked the source column on the server and verified that it is a datetime column. Are there any other reasons this error could be thrown?
It looks like your source table definition is not the same in the development and production environments. Since you didn't provide details about which source component and connection manager you use, or what your source query is (maybe you CAST or CONVERT some data), we have to make some assumptions.
As stated in SSIS Error and Message Reference, error code 0xC020837F (-1071611009) has name DTS_E_ADOSRCDATATYPEMISMATCH and description:
The data type of "" does not match the data type "" of the source
column "__".
From the error name (DTS_E_ADOSRCDATATYPEMISMATCH) and the "System.Byte[]" part of the message, I conclude that you are probably using an ADO NET Source component.
For a start, check the following: open the properties of the source component, uncheck the particular column, and check it again. This forces the source component to refresh its external and output columns. This trick works for the OLE DB source, and it might help you here as well.
If that doesn't help, check the following links to see whether some of your source data types map to System.Byte[]:
Integration Services Data Types
SQL Server Data Types Mappings (ADO.NET)
Working with Data Types in the Data Flow
Probably, in either the development or the production environment, the column is of type timestamp, image, varbinary, or some other type that maps to managed System.Byte[], but in the other environment it is not. Please recheck the source table definitions.
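If you want to confirm which side is handing back a byte array, a quick hypothetical ADO.NET check like the one below prints the CLR type of the column: a timestamp/rowversion (or image/varbinary) column surfaces as System.Byte[], while a datetime surfaces as System.DateTime. The connection string, table, and column names are placeholders.

    using System;
    using System.Data.SqlClient;

    class ColumnTypeCheck
    {
        static void Main()
        {
            // Placeholder connection string and query -- adjust to your environment.
            string connStr = "Server=.;Database=MyDb;Integrated Security=true;";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT TOP 1 col1 FROM dbo.MyTable", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    // timestamp/rowversion, image and varbinary all surface as System.Byte[];
                    // datetime surfaces as System.DateTime.
                    Console.WriteLine(reader.GetFieldType(0));
                }
            }
        }
    }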
If this doesn't help, please post the CREATE statements for your source tables as well as the source query itself.

SSIS 2008 Excel Source - Problems loading Alphanumeric columns

I am using SSIS 2008 to load alphanumeric columns from Excel.
I have one column which starts off as integer
1
2
...
999
Then changes to AlphaNumeric
A1
A2
A999
When I try to load using an Excel Data Source, Excel will always say that it is an integer, as it must only sample the top of the file.
(BTW - I know that I can re-order the file so that the alphas are at the top but I would rather not have to do this...)
Unfortunately, you don't seem to be able to change its mind. This means that when it loads the data, it filters out the 'A', and the A999 record will update the 999 record. This is obviously not good...
I have tried to change the external and output columns to string under the advanced editing options, but I get errors and it won't run until I set the columns back to integer.
Does anyone have a solution?
SSIS uses Jet to access the Excel files. By default, Jet scans the first 8 rows of your data to determine the type of each column.
To fix it, you will need to edit the registry and increase the TypeGuessRows DWORD value under one of the following registry keys; that value controls how many rows Jet scans in your data (a sketch for setting it programmatically follows the key list).
Which key to use depends on your version of Windows and your version of Excel, as follows:
For 32-bit Windows
Excel 97
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.5\Engines\Excel
Excel 2000 and later versions
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel
For 64-bit Windows
Excel 97
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\3.5\Engines\Excel
Excel 2000 and later versions
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel
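If you prefer not to edit the key by hand, something along these lines (a hypothetical sketch; run it elevated, and pick the key path from the list above that matches your setup) sets the value programmatically:

    using Microsoft.Win32;

    class SetTypeGuessRows
    {
        static void Main()
        {
            // Key path for 32-bit Windows / Excel 2000 and later (see the list above);
            // use the Wow6432Node path on 64-bit Windows.
            const string keyPath =
                @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel";

            // 0 = scan all rows (up to 16,384); 1-16 = number of rows to sample.
            Registry.SetValue(keyPath, "TypeGuessRows", 0, RegistryValueKind.DWord);
        }
    }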
Then, specify IMEX=1 in the connection string as follows:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\abc.xls;
Extended Properties="EXCEL 8.0;HDR=YES;IMEX=1";
This information can be found in a more verbose form at: http://support.microsoft.com/kb/189897/
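For reference, a connection string like that can also be used directly from code; here is a minimal hypothetical sketch with System.Data.OleDb (the file path and sheet name are placeholders, and the Jet provider is 32-bit only):

    using System;
    using System.Data.OleDb;

    class ImexImportDemo
    {
        static void Main()
        {
            // IMEX=1 tells the driver to treat columns with intermixed types as text.
            string connStr =
                @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\abc.xls;" +
                @"Extended Properties=""EXCEL 8.0;HDR=YES;IMEX=1"";";

            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine(reader[0]); // mixed column values now come back as text
                    }
                }
            }
        }
    }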
Wow, that looks like a huge pain. I came across a couple of examples where you could alter the connection string and sometimes get better results but they don't seem to work for everyone.
Scripting an automatic conversion to a .csv file would be a good workaround; there are a number of suggestions in this thread:
converting an Excel (xls) file to a comma separated (csv) file without the GUI
including some code in C# that you may be able to easily plop in:
http://jarloo.com/code/api-code/excel-to-csv/
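If Excel is installed on the machine doing the conversion, a rough sketch along those lines using Excel interop looks like this (the file paths are placeholders, and error handling is omitted):

    using Microsoft.Office.Interop.Excel; // add a reference to the Excel interop assembly

    class XlsToCsv
    {
        static void Main()
        {
            // Placeholder paths -- point these at your own workbook.
            const string xlsPath = @"D:\abc.xls";
            const string csvPath = @"D:\abc.csv";

            var excel = new Application();
            excel.DisplayAlerts = false; // don't prompt when overwriting the .csv

            try
            {
                Workbook wb = excel.Workbooks.Open(xlsPath);
                wb.SaveAs(csvPath, XlFileFormat.xlCSV); // saves the active sheet as CSV
                wb.Close(false);
            }
            finally
            {
                excel.Quit();
            }
        }
    }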
Here is a similar question where altering the connection string is discussed, if you want to look into it for yourself: SSIS Excel Import Forcing Incorrect Column Type
Good luck!