Progress SQL error in SSIS package: buffer too small for generated record

I have an SSIS package which uses a SQL command to get data from a Progress database. Every time I execute the query, it throws this specific error:
ERROR [HY000] [DataDirect][ODBC Progress OpenEdge Wire Protocol driver][OPENEDGE]Internal error -1 (buffer too small for generated record) in SQL from subsystem RECORD SERVICES function recPutLONG called from sts_srtt_t:::add_row on (ttbl# 4, len/maxlen/reqlen = 33/32/33) for . Save log for Progress technical support.
I am running the following query:
Select max(ROWID) as maxRowID from TableA
GROUP BY ColumnA,ColumnB,ColumnC,ColumnD

I've had the same error.
After changing the startup parameters -SQLTempStorePageSize and -SQLTempStoreBuff to 24 and 3000 respectively, the problem was solved.
I think for you the values should be changed to 40 and 20000.
You can find more information here. The name of the parameter in that article was a bit different than in my database; it depends on the Progress version which is used.
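For reference, a minimal sketch of where those values could go, assuming the database broker is started with a .pf parameter file (the file name and the final values below are placeholders to adapt to your own setup):

# excerpt from a hypothetical conn.pf used when starting the database
-SQLTempStorePageSize 40
-SQLTempStoreBuff 20000

The same switches can also be passed directly on the server startup command line; restart the database after changing them so the SQL engine picks up the new temp-store sizing.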

Related

REP-57054: 'In Process Job Terminated' error while executing oracle reports

When I execute Oracle Reports I get the above-mentioned error; it occurs while the report is executing.
I am using a query with three formula columns and generating XML for an RTF template.
All formula columns compiled successfully. How can I resolve this issue?
A workaround, while waiting for a fix, is to set cacheSize to 50 in $INST_TOP/ora/10.1.2/reports/conf/rwbuilder.conf.
When cacheSize is 0 in the server conf file, Cache.manage() removes output files in the cache directory after a request finishes successfully, but a non-zero value disables the cache clean-up functionality.
For more details, check Oracle Doc ID 1237834.1.
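For orientation only, a hedged sketch of what the cache entry in rwbuilder.conf may look like in a 10.1.2 installation; the exact element and attribute names depend on your Reports version, so treat this as an assumption and verify against the Doc ID above:

<cache class="oracle.reports.cache.RWCache">
   <property name="cacheSize" value="50"/>
</cache>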

How to get the exact query generated by the ODBC driver

We are connecting to a Hadoop Cloudera CDH distribution through an ODBC driver. The queries are generated from SSRS. A few queries work fine with parameters supplied through the ? placeholder; a few other queries with parameters supplied through ? do not execute.
Error [HY000][Cloudera][ImpalaODBC] (100) error while executing a query in Impala [HY000]: AnalysisException: syntax error in line 1: where Date >= ? and Date <= ?
^ Encountered: Unexpected character. Expected: CASE... Exception: syntax error.
If I remove where Date >= ? and Date <= ?, or supply hard-coded values, the query works perfectly.
A few other queries with the same filter also work perfectly.
What are the recommended investigation points?
Where could I get the exact query as transformed for Impala, to check whether the query is being generated correctly or not?
You have a couple of options:
/var/log/impalad/audit stores the audit logs (at least in CDH). Those logs contain a sql_statement field that records the executed SQL queries.
Impala has a web server running on port 25000. You can connect with your browser and see the executed queries (the /queries tab).
If you are using Cloudera Manager, you can see all executed Impala queries under "Impala/Queries".
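For a quick check from the command line, you could also pull the same page the browser would show; impalad-host below is a placeholder for one of your Impala daemon hosts:

curl http://impalad-host:25000/queries

That page lists the statements Impala actually received, so you can compare them against what SSRS sent through the ODBC driver.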

SSIS The expression for variable 'Variable' failed evaluation. There was an error in the expression

So here we have an error I keep getting in my SSIS package, but I can't see what is wrong with the statement. I have even tried another SQL statement from a project that works, and it still raises the error.
The system is VS 2005 running the 64-bit debugger, on an XP machine. The project has, amongst other things, a Script Task and then a SQL Task; the Script Task outputs the month value to a variable (Dts.Variables("monthName").Value = month), which I then use to create a dynamic table name in the SQL statement. I haven't got to the Excel sheet part yet, as I am trying to get the SQL Task stage working.
So I have a variable at package level called SQLTableCreate, and its properties are set to:
Evaluate as Expression = true
Expression = "Create Table "+ #[user::monthName]+"(Column1 DATETIME,Column2 NVARCHAR(255),Column3 NVARCHAR(255),Column4 NVARCHAR(255),Column5 NVARCHAR(255),Column6 NVARCHAR(255),Column7 NVARCHAR(255),Column8 NVARCHAR(255),Column9 NVARCHAR(255),Column10 NVARCHAR(255))"
And when I build the package I get:
Nonfatal errors occurred while saving the package:
Error at Package: The variable "user::monthName" was not found in the Variables collection. The variable might not exist in the correct scope.
Error at Package: Attempt to parse the expression ""Create Table "+ #[user::MonthName]+"(Column1 DATETIME,Column2 NVARCHAR(255),Column3 NVARCHAR(255),Column4 NVARCHAR(255),Column5 NVARCHAR(255),Column6 NVARCHAR(255),Column7 NVARCHAR(255),Column8 NVARCHAR(255),Column9 NVARCHAR(255),Column10 NVARCHAR(255))"" failed and returned error code 0xC00470A6. The expression cannot be parsed. It might contain invalid elements or it might not be well-formed. There may also be an out-of-memory error.
Error at Package: The expression for variable "SQLTableCreate" failed evaluation. There was an error in the expression.
There is also a default SQL statement for the variable SQLTableCreate, which uses the current Excel connection manager table name. When I put my dynamic statement in the Expression section of the properties, it fills the Value and ValueType properties of the SQLTableCreate variable with the message:
The expression for variable "SQLTableCreate" failed evaluation. There was an error in the expression.
It's exactly as the error says:
The variable "user::monthName" was not found in the Variables collection
Things in SSIS are case sensitive, and variable names are one of those things. Make your expression:
"Create Table "+ #[User::monthName]+"(Column1 DATETIME,Column2 NVARCHAR(255),Column3 NVARCHAR(255),Column4 NVARCHAR(255),Column5 NVARCHAR(255),Column6 NVARCHAR(255),Column7 NVARCHAR(255),Column8 NVARCHAR(255),Column9 NVARCHAR(255),Column10 NVARCHAR(255))"
Also, I hope this table design is just a sample and not real. Lack of column names and strong data types is technical debt you don't need to incur at this stage.

SSIS - Importing data into tables

When I am importing data into tables, it fails with the following error:
"The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure."
I have the structures of the tables like below:
INSERT INTO [dbo].[BlockedPID]
([BlockedPIDID],[GroupChannelID],[RequestID],[SerialStart],[SerialEnd],[ReasonBlockedID],[ChangeControl],[LastModifiedDate],[LastModifiedBy],[MessageID],[TenantID],[EventID]
,[DetailUUID])
VALUES(,,,
,,,,
,,,,,)
USE [KISDB]
GO
INSERT INTO [dbo].[RetailRuleException]
([RetailRuleExceptionID],[RequestID],[RetailRuleMasterID],[MPCMasterID],[GroupChannelIDFrom]
,[GroupChannelIDTo],[SerialStart],[SerialEnd],[RIT],[ROT],[RAT],[ActivationLimit],[ActivationOverrideLimit],[IsActive],[ChangeControl],[LastModifiedDate],[LastModifiedBy])
VALUES(,,,
,,,,
,,,,,
,,,,)
GO
We have a 'ChangeControl' column with a binary(8) datatype, and the Data Flow Task is not able to store it in memory while importing the data.
Could anyone help me mitigate this problem?
Thanks,
Vijay Sharma

SSIS (2008R2) import from MSSQL to MySQL failing due to a date column

I have an OLE DB connection to MSSQL and an ADO.NET destination (using an ODBC driver) to MySQL. The tables are exactly the same and all the columns are working bar one.
The error message received is:
[ADO NET Destination [325]] Error: An exception has occurred during data insertion, the message returned from the provider is: Unable to cast object of type 'System.DateTime' to type 'System.Char[]'.
I've seen similar questions on other data types, but the resolution of changing to a string does not work here. If I convert to a string (it has to be length 29, otherwise the conversion step fails) I get the following error message:
[ADO NET Destination [325]] Error: An exception has occurred during data insertion, the message returned from the provider is: ERROR [HY000] [MySQL][ODBC 5.1 Driver][mysqld-5.5.15]Incorrect datetime value: '2011-03-21 11:23:48.573000000' for column 'LastModificationDate' at row 1
Other potentially relevant details:
connection driver- {MySQL ODBC 5.1 Driver}
script run before dataflow - set sql_mode='STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES'
Other datetime columns are working
This column has a reasonably high proportion of nulls
mssql spec: [LastModificationDate] [datetime] NULL
mysql spec: LastModificationDate datetime NULL
Has anyone had experience with this issue and could provide some advice on resolving it?
Can you try converting it to a string on the SQL Server side in your query, using:
convert(char(10),LastModificationDate,111)+' '+convert(char(8),LastModificationDate,108)
This works for me all the time.
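For example, applied inside the OLE DB source query (dbo.SourceTable and the other column names here are placeholders for your own table):

SELECT Column1,
       Column2,
       CONVERT(char(10), LastModificationDate, 111) + ' ' + CONVERT(char(8), LastModificationDate, 108) AS LastModificationDate
FROM dbo.SourceTable

Style 111 gives yyyy/mm/dd and style 108 gives hh:mi:ss, so the concatenated value no longer carries the fractional seconds the destination rejected; NULL dates stay NULL because concatenating with NULL yields NULL.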
I got the same big headache this week. I tried many ways and, thank God, finally one of them worked. I hope it helps you a little bit.
It applies to columns with data types such as int, datetime, decimal, and so on; here I call the column ColumnA and use it as a datetime.
1. In the Data Flow source, use a SQL command to retrieve the data, something like: select isnull(ColumnA,'1800-01-01') as ColumnA, C1, C2, ... Cn from Table
Make sure to use the isnull function for every column with one of the data types mentioned before.
2. Execute the SSIS package. It should work.
3. Go back to the Control Flow and, after the Data Flow Task, add an Execute SQL Task to put the data back, i.e. update ColumnA from '1800-01-01' to null again.
That works for me. In my situation I cannot use the ignore-failure option, because if I did I would lose thousands of rows of data.
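A minimal sketch of that approach, assuming a source table dbo.SourceTable and a MySQL destination table DestTable (both names are placeholders):

-- step 1: Data Flow source query, replacing NULLs with a sentinel date
SELECT ISNULL(ColumnA, '1800-01-01') AS ColumnA, C1, C2
FROM dbo.SourceTable

-- step 3: Execute SQL Task against the MySQL destination, restoring the NULLs
UPDATE DestTable SET ColumnA = NULL WHERE ColumnA = '1800-01-01';

The sentinel just has to be a date that cannot occur in the real data, so it can be matched and set back to NULL safely afterwards.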