Field length constraint when using linked server - sql-server-2008

I have a view in a database that uses a column whose length was once varchar(30) and has since been changed to varchar(50).
I can execute a SELECT directly on the database and it returns the expected results.
But when I run the same query from a different server, which executes it through a linked server, this error shows up:
OLE DB provider 'SQLNCLI' for linked server 'myserver' returned
data that does not match expected data length for column
'[192.168.0.107].[MyDB].[dbo].[Myview].Mycolumn'. The (maximum)
expected data length is 30, while the returned data length is 50.
This is not an INSERT into a table with a length-30 column, and I have no idea where the length 30 comes from. It is really just a SELECT query.
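A sketch of one likely cause, under the assumption that the view's metadata is stale: when an underlying column is widened, SQL Server does not automatically update the metadata of non-schema-bound views that reference it, so the linked server can keep reporting the old varchar(30) length. Running sp_refreshview on the remote server (the view name here is taken from the error message) refreshes that metadata:

```sql
-- Run on the remote server that hosts the view.
-- Refreshes the cached metadata of a non-schema-bound view
-- after an underlying column has been altered.
EXEC sp_refreshview N'dbo.MyView';

-- Check the column length the view now reports:
SELECT name, max_length
FROM sys.columns
WHERE object_id = OBJECT_ID(N'dbo.MyView');
```

If the view already reports 50 on the remote side, the linked server definition itself may be caching metadata, in which case dropping and recreating the linked server connection is worth trying.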

Related

Error code DFExecutorUserError / Problem with loading a CSV file from Blob storage to an Azure SQL Database

I am having trouble importing a CSV file into an Azure SQL Database; it gives me the following error:
Error code: DFExecutorUserError;
Failure type: User configuration issue;
Details: Job failed due to reason: The given value of type NVARCHAR(360) from the data source cannot be converted to type nvarchar(50) of the specified target column.
Source: Pipeline pipeline1;
Data flow dataflow1;
Monitor: Data flow activity Data flow1
The dataflow consists of a source, a derived column (where I convert the data types of a few columns from string to int and date) and a sink.
One of the columns (Message) has a lot of text on every row (mostly e-mails from customers), and for that column I have set varchar(max) in the database.
Thanks in advance for the replies.
I tried to reproduce the issue in my own system and got the same error as you.
The main cause is that when we move data from Blob storage to SQL and the target table was created with a column that is too small, the copy fails, because a value cannot be inserted beyond the column's size.
To resolve this, either set the column to max at table-creation time if you don't know how large the data will be, or add a pre-copy script that alters the column to max, and then run your pipeline:
ALTER TABLE table_name ALTER COLUMN column_name varchar(max)
Pipeline executed successfully.
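A minimal sketch of the fix, using a hypothetical table dbo.Messages with an undersized Message column (both names are illustrative, not taken from the pipeline above):

```sql
-- Widen the undersized column before the copy runs;
-- nvarchar(max) accepts any NVARCHAR source value, so the
-- "cannot be converted to type nvarchar(50)" error goes away.
ALTER TABLE dbo.Messages ALTER COLUMN Message nvarchar(max);

-- Alternatively, size the column to the real source width
-- (the error reported NVARCHAR(360) coming from the source):
-- ALTER TABLE dbo.Messages ALTER COLUMN Message nvarchar(400);
```

Either statement works as the pre-copy script of the Copy activity, since it runs against the sink database before each load.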

How to solve error 7347 in linked server

I have two databases, MySQL and SQL Server.
I created a linked server between these databases without any problem, but when I tried to transfer data from MySQL to SQL Server, I ran into this error:
OLE DB provider 'MSDASQL' for linked server 'TWITTER' returned data that does not match expected data length for column '[MSDASQL].t_u_description'. The (maximum) expected data length is 30, while the returned data length is 55.
The table, column and data type are the same in both databases, but I don't know how to solve this problem.
The data in this column (t_u_description) is non-English.
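A common workaround, sketched here under the assumption that the ODBC driver reports the column length in characters while the non-English data occupies more bytes per character, is to cast the column to an explicit, generous width on the MySQL side via OPENQUERY (the table name twitter_users is hypothetical):

```sql
-- Force MySQL to return the column with an explicit width
-- large enough for multi-byte text, so the length reported
-- to SQL Server matches the data actually returned.
SELECT *
FROM OPENQUERY(TWITTER,
    'SELECT CAST(t_u_description AS CHAR(255)) AS t_u_description
     FROM twitter_users');
```

Setting the MySQL ODBC driver's character set to utf8 in the DSN is also worth checking, since a mismatched charset is a frequent trigger for error 7347.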

SSIS moving Dirty/Failed Rows to File or Table

I have an SSIS package that moves data from a SQL Server 2008 table (table A) on one server to a SQL Server 2008 table (table B) on another server, then converts the data and moves it to another table.
I'm using a data flow task to do the above.
Two columns, column 1 and column 2, are varchar(60) and varchar(50) respectively in both tables.
I need to import the data from table B, which is the staging table, into a final table (table C), where those two columns are of type int. I convert them to int during the import, using a data flow task whose OLE DB source
has the following query:
SELECT CAST(COLUMN1 AS INT) AS COLUMN1,
       CAST(COLUMN2 AS INT) AS COLUMN2
FROM TableB
I have two OLE DB destinations: one receives the correct rows (those successfully converted to int) through the green output, and the other receives the erroneous rows (those that throw an error) through the red output, which is configured to redirect rows only on conversion errors and to fail the component on truncation errors. The error destination table has the same structure as the source table, plus columns for the error number and column name.
When I run the data flow task that imports data from the source table to the destination table, I get the error
"invalid character value for cast specification", i.e. a data conversion error.
I do not want it to fail; I want the erroneous rows redirected to the error table instead.
Would it be better to use a data conversion transformation and redirect the error rows (configuring an error output for conversion errors, truncation errors, or both), or to cast the value in the source query and redirect rows only when a conversion error occurs?
As the CAST error occurs within the SQL engine, you cannot redirect those rows using SSIS. I would use a data conversion transformation and redirect the error rows.
FWIW, I'm not a fan of "error tables": in my experience no-one ever looks at them. I prefer the "aggressive load" style (even the name is cool), where you force all rows in and make the analysts explain the discrepancies. They soon get around to fixing the serious issues, as those distort their results. There will always be a low level of trivial errors that it's best not to get hung up on, life being so short and all...
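If you do want to keep the cast in the source query, here is a sketch of a SQL Server 2008-compatible safe cast (on 2012+ TRY_CAST does this directly); TableB and the column names follow the question, and note that ISNUMERIC is only a rough filter, since it accepts values such as '1e5' or '$' that still fail an int cast:

```sql
-- Rows that fail the numeric check come through as NULL
-- instead of failing the whole data flow; they can then be
-- separated with a Conditional Split on ISNULL(COLUMN1).
SELECT CASE WHEN ISNUMERIC(COLUMN1) = 1
            THEN CAST(COLUMN1 AS INT) END AS COLUMN1,
       CASE WHEN ISNUMERIC(COLUMN2) = 1
            THEN CAST(COLUMN2 AS INT) END AS COLUMN2
FROM TableB;
```

This keeps all rows flowing through the pipeline, which fits the "aggressive load" approach described above.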

ODBC linked table not showing fractions of seconds

I have linked an IBM Informix database table through an ODBC connection to an Access 2010 database. My issue is that the date field in this table shows only dd/mm/yy HH:nn:ss in the Access view, while the stored data is accurate to 1/1000th of a second.
I can see this precision in Excel 2010 but not in Access 2010: is this possible? Not having this level of accuracy is preventing me from making accurate calculations!
There is a similar question on another forum here. The Date/Time field type in Access does not store fractions of seconds, and linked tables implicitly cast their columns to the corresponding Access data type, so the fractions of seconds are not available in a linked table even though they are stored in the remote database.
For example, I have a SQL Server database with a table named dbo.linkedTable that has a datetime column, [datetimeCol], containing fractions of seconds.
If I create a linked table in Access, [datetimeCol] is mapped to the Date/Time field type in Access and the times are rounded to the nearest second.
As a workaround, I can create a Pass-Through query that uses T-SQL to convert the datetime value to a string...
SELECT ID, CONVERT(varchar, datetimeCol, 21) AS strDatetime FROM dbo.linkedTable
...and I can then parse the [strDatetime] string value to retrieve the fractional seconds.
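As a sketch of what style 21 produces (a literal example value, not output from the table above): CONVERT style 21 yields the ODBC canonical format with milliseconds, yyyy-mm-dd hh:mi:ss.mmm, so the fractional seconds survive the trip into Access as text:

```sql
-- Style 21 keeps the milliseconds that the Access
-- Date/Time type would otherwise discard.
SELECT CONVERT(varchar(23),
               CAST('2015-06-01 10:30:45.123' AS datetime),
               21) AS strDatetime;
-- strDatetime = '2015-06-01 10:30:45.123'
```

In Access the last three characters of [strDatetime] can then be read back with a string function such as Right() to recover the milliseconds.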

SQL Server: DB start-up validation and row-count checks on a few tables

I am creating a validation script to run against the archive server that confirms the databases are up and online after the refresh (SAN level) is done. The script should be small and quick, but check enough to provide good information.
Example
Is the db online?
Does a count of rows for a table, inserted into a temp table, match the count returned by a SELECT COUNT?
Is the processdate value in the current processdate equal to today – 1?
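A minimal sketch of such a script, assuming a hypothetical database ArchiveDB, a table dbo.MyTable, and a dbo.ProcessControl table holding the current processdate (all names are illustrative, not from the question):

```sql
-- 1. Is the database online?
SELECT DATABASEPROPERTYEX('ArchiveDB', 'Status') AS DbStatus;  -- expect 'ONLINE'

-- 2. Does a row count captured in a temp table match a live count?
SELECT COUNT(*) AS RowCountSnapshot
INTO #counts
FROM ArchiveDB.dbo.MyTable;

SELECT CASE WHEN (SELECT RowCountSnapshot FROM #counts)
                 = (SELECT COUNT(*) FROM ArchiveDB.dbo.MyTable)
            THEN 'MATCH' ELSE 'MISMATCH' END AS CountCheck;

-- 3. Is the current processdate equal to today - 1?
SELECT CASE WHEN processdate = DATEADD(DAY, -1, CAST(GETDATE() AS DATE))
            THEN 'OK' ELSE 'STALE' END AS ProcessDateCheck
FROM ArchiveDB.dbo.ProcessControl;

DROP TABLE #counts;
```

Each check returns a single labelled value, so the whole script stays quick and its output is easy to scan after a refresh.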