How to reveal DMS SQL query logging with binding values? (MySQL)

I'm trying to get the logs created by AWS DMS.
I read the DMS documentation and successfully captured DMS's SQL logging, like the following:
2017-02-17T00:58:29 [TARGET_APPLY ]D: Construct statement execute internal: 'UPDATE `some_schema`.`typical_usr_master` SET `id`=? WHERE `id`=? AND `start_dt`=? ''(ar_odbc_stmt.c:3323)
However, this log doesn't include the original bind values, for example id or start_dt.
If they were revealed, the values would look like id = "00000001", start_dt = "2017-02-17".
Is there any way to see such bind values in DMS logging?
I have already changed all logging levels to DEBUG, but only ERROR logging shows such bind values.

I received an answer from AWS support:
Q: How to reveal DMS SQL query logging with binding values?
A: Unfortunately, we do not expose bind values in logs because of security concerns. Normally, we recommend customers look at their target to get the actual data values we migrated.
I'd still be happy if there were a way to check data integrity.
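If you do want to spot-check integrity yourself, one minimal approach (a sketch, assuming you can run SQL against both the source and the MySQL target; the table name is taken from the log above) is to compare row counts and checksums on both sides:

-- Run on both the source and the DMS target, then compare the output.
-- CHECKSUM TABLE results are only comparable when both sides share the
-- same table definition and storage engine.
CHECKSUM TABLE `some_schema`.`typical_usr_master`;
SELECT COUNT(*) FROM `some_schema`.`typical_usr_master`;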

Related

How to get max_allocated_packets from JDBC connection from a Slick DB connection to MySQL

Is there a way, using Scala with the Slick database query and access library (or other tricks, dare I say mocks?), to read the max_allocated_packets connection property from a Slick-managed JDBC connection to MySQL?
I suspect the code performs several handshake-type actions at deeper levels, and this connection property is then populated.
For example: once a connection is made in com.mysql.cj.jdbc.ConnectionImpl ... using Scala with the Slick library ... the value of the JDBC connection property max_allocated_packets is present inside the object (observed while debugging in IntelliJ). How can I extract this value, or obtain it in higher-level code, as asked above?
Of course I can query the DB directly to get that value, but I am hoping I can extract this property after the setup phase.
If the value is publicly available on a connection, you could try to use a SimpleDBIO action to access the JDBC-level values.
See: https://scala-slick.org/doc/3.2.0/dbio.html#jdbc-interoperability
It would be of the form:
val getMaxPacketAction = SimpleDBIO[Int] { ctx =>
  // ctx.connection is the underlying java.sql.Connection;
  // read whatever you need from it here
}
However, this still goes to the database for a connection, so it may be easier to simply query for the value you want.
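If a round trip to the database is acceptable anyway, here is a fuller sketch. It assumes that max_allocated_packets actually refers to MySQL's max_allowed_packet server variable, which is the closest standard name:

import slick.jdbc.MySQLProfile.api._

// Sketch: read the server-side max_allowed_packet value through the
// raw JDBC connection that Slick hands to a SimpleDBIO action.
val getMaxPacketAction = SimpleDBIO[Long] { ctx =>
  val stmt = ctx.connection.createStatement()
  try {
    val rs = stmt.executeQuery("SELECT @@max_allowed_packet")
    rs.next()
    rs.getLong(1)
  } finally stmt.close()
}

// db.run(getMaxPacketAction) yields a Future[Long]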

Does Knex.js prevent sql injection?

I'm using a MySQL database and was trying to find a MySQL alternative to tedious.js (a SQL Server parameterised query builder). I'm using Node.js for my backend.
I read that the .raw() command from knex.js is susceptible to SQL injection if not used with bindings.
But are the other commands, and knex.js as a whole, safe to use to prevent SQL injection? Or am I barking up the wrong tree?
Read carefully in the knex documentation how to pass values to knex.raw (http://knexjs.org/#Raw).
If you are passing values as parameter bindings to raw, like:
knex.raw('select * from foo where id = ?', [1])
then the parameters and the query string are passed separately to the database driver, protecting the query from SQL injection.
The other query builder methods always use the binding format internally, so they are safe too.
To see how a certain query is passed to the database driver, one can do:
knex('foo').where('id', 1).toSQL().toNative()
This will output the SQL string and the bindings that are given to the driver for running the query (https://runkit.com/embed/2yhqebv6pte6).
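For the example above, the native output looks something like this (MySQL dialect):
{ sql: 'select * from `foo` where `id` = ?', bindings: [ 1 ] }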
The biggest mistake one can make with knex raw queries is to use a JavaScript template string and interpolate variables directly into the SQL string, like:
knex.raw(`select * from foo where id = ${id}`) // NEVER DO THIS
One thing to note is that table/identifier names cannot be passed to the driver as bindings, so one should be extra careful not to take table or column names from user input and use them without properly validating them first.
Edit:
By saying that identifier names cannot be passed as bindings, I mean that when one uses the ?? knex binding for an identifier name, it will be rendered as part of the SQL string before being passed to the database driver.
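For example (a small sketch; foo and id are hypothetical names):
knex.raw('select * from ?? where ?? = ?', ['foo', 'id', 1]).toSQL().toNative()
// -> { sql: 'select * from `foo` where `id` = ?', bindings: [ 1 ] }
// The ?? identifiers are already baked into the SQL string; only the
// ? value travels separately as a binding.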

Get last cube processed date in SSIS

I need to get the last processed date of an SSAS cube in SSIS and save it into a variable.
I've tried an "Execute SQL Task":
SELECT LAST_DATA_UPDATE as LAST_DT FROM $system.mdschema_cubes
WHERE CUBE_NAME = 'CubeName'
It works OK in the SQL Server Management Studio MDX query window, but in SSIS it says: Unsupported data type on result set binding.
Then I've tried:
WITH MEMBER [Measures].[LastProcessed] AS ASSP.GetCubeLastProcessedDate() SELECT [Measures].[LastProcessed] ON 0 FROM [CubeName]
And it says '[ASSP].[GetCubeLastProcessedDate]' function does not exist.
Any ideas how to do this?
Thank you
A linked server might be your best option.
Create the linked server with the following, changing as appropriate:
EXEC master.dbo.sp_addlinkedserver
    @server = N'LINKED_SERVER_OLAP_TEST',  -- change to a suitable name
    @srvproduct = '',                      -- leaves the product name blank
    @provider = N'MSOLAP',                 -- Analysis Services
    @datasrc = N'localhost',               -- change to your data source
    @catalog = N'TESTCUBE'                 -- change to set the default cube
Change the data source of your Execute SQL Task to make sure it points at one of the databases on the server where the linked server is defined, i.e. use a standard OLE DB data source, not an Analysis Services one. Then put the following in your Execute SQL Task (changing as appropriate):
SELECT *
FROM OpenQuery(LINKED_SERVER_OLAP_TEST,'SELECT LAST_DATA_UPDATE as LAST_DT FROM $system.mdschema_cubes
WHERE CUBE_NAME = ''CUBENAME''')
Set the variable to be DATETIME and the result set to be single row.
There may well be other ways to do this, however I have always found this method the most straight forward.
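If the result set binding still rejects the column type, a variation worth trying (a sketch, not verified against every provider version) is to cast explicitly on the SQL Server side so SSIS sees a plain datetime:

SELECT CONVERT(datetime, LAST_DT) AS LAST_DT
FROM OpenQuery(LINKED_SERVER_OLAP_TEST,'SELECT LAST_DATA_UPDATE as LAST_DT FROM $system.mdschema_cubes
WHERE CUBE_NAME = ''CUBENAME''')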

mysql queries before insert operation by syslog-ng

I am using syslog-ng with a csv-parser to parse some logs I am receiving. However, I want to achieve insert operations that are a bit more complex than the conventional insert done by the sql() destination. Currently, the MySQL destination in my syslog-ng conf file looks like this:
destination d_sql_test {
    sql(
        type(mysql)
        host('<host>')
        username('<user>')
        password('<pass>')
        database('<db_name>')
        table('test')
        columns('col1')
        values('${val1}')
    );
};
However, this simply inserts the contents of val1 into the column col1. I want to be able to specify my insert "logic", as shown in the example in this question.
I am unsure where to actually do this, or whether it is even supported by syslog-ng.
I think you can do this if you can somehow make the decision within syslog-ng.
You could try to use an in-list() filter to check whether the username is already listed in a file. If it is not, you can send the log to the mysql destination, and also to another destination (possibly a program() destination) that updates the file containing the list of users and reloads syslog-ng to refresh the in-list filter; a sketch follows below.
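A minimal sketch of that idea (the file path, the source and parser names, and the username column macro are all assumptions for illustration):

# hypothetical file listing usernames already inserted, one per line
filter f_known_user {
    in-list("/etc/syslog-ng/users.list", value("username"));
};
filter f_new_user {
    not filter(f_known_user);
};

log {
    source(s_net);            # assumed source name
    parser(p_csv);            # the existing csv-parser
    filter(f_new_user);       # pass only first-seen users
    destination(d_sql_test);
};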
Alternatively, you can write a syslog-ng template function in Python that implements the logic somehow, for example setting a macro to 1 in the message if it should be sent to the database. Then you can use a filter on this macro in your log path with the mysql destination.
Or, if you can do the work in a separate destination written in Python: see Writing syslog-ng destinations in Python.
Also, you might want to post this question on the syslog-ng mailing list, where the developers notice it more easily.

SELECT * FROM MySQL Linked Server using SQL Server without OpenQuery

I am trying to query a MySQL linked server using SQL Server.
The below query runs just fine.
SELECT * FROM OPENQUERY([Linked_Server], 'SELECT * FROM Table_Name')
Is it possible to run the same query without using the OpenQuery call?
Found the answer here. Now I can use the three-dot notation query. Thanks.
http://www.sparkalyn.com/2008/12/invalid-schema-error/
Go to the provider options screen. In SQL Server 2005 you can see the list of providers in a folder above the linked server (assuming you have appropriate permissions); right-click on MSDASQL and go to Properties. In SQL Server 2000, the provider options button is in the dialog box where you create the linked server.
Check the box that says "Level zero only".
You can use the statement below:
select * from [linkedServerName]...[databaseName.TableName]
But before executing it, you have to make some changes in SSMS:
SSMS -> expand the "Linked Servers" folder -> open the "Providers" folder -> find MSDASQL and open its properties
Then check "Level zero only" and press OK.
Then execute the query above.
Try like this:
SELECT * FROM [Linked_Server]...[db_name.table_name]
This works properly; however, there are problems with converting data types.
Safer and more reliable to use is OPENQUERY:
SELECT * FROM OPENQUERY([Linked_Server], 'SELECT * FROM db_name.table_name')
You should be able to simply query the linked server directly.
select * from mylinkedserver.database.schema.mytable
EDIT:
Try with the three dot notation as noted in this post:
http://www.ideaexcursion.com/2009/02/25/howto-setup-sql-server-linked-server-to-mysql/
SELECT * FROM MYSQLAPP...tables
With the original four-part name, you get an error like:
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "MSDASQL" for linked server "MySQLApp" reported an error. The provider did not give any information about the error.
Msg 7312, Level 16, State 1, Line 1
Invalid use of schema or catalog for OLE DB provider "MSDASQL" for linked server "MySQLApp". A four-part name was supplied, but the provider does not expose the necessary interfaces to use a catalog or schema.
This "four-part name" error is due to a limitation in the MySQL ODBC driver. You cannot switch catalogs/schemas using dotted notation. Instead, you will have to register another DSN and linked server for each catalog you want to access. Be sure to follow the three-dot notation noted in the example query.
There is an important point when using this:
SELECT * FROM [Linked_Server]...[db_name.table_name]
You must go to
Linked Servers -> Providers -> MSDASQL
and make sure these three options are checked:
Dynamic parameter
Level zero only
Allow inprocess
https://www.sqlteam.com/forums/topic.asp?TOPIC_ID=153024
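If you prefer to script these provider options instead of clicking through SSMS, the undocumented sp_MSset_oledb_prop procedure can set them (a sketch; verify it on your SQL Server version before relying on it, as the property names follow the provider's registry values rather than the SSMS checkbox labels):

EXEC master.dbo.sp_MSset_oledb_prop N'MSDASQL', N'DynamicParameters', 1
EXEC master.dbo.sp_MSset_oledb_prop N'MSDASQL', N'LevelZeroOnly', 1
EXEC master.dbo.sp_MSset_oledb_prop N'MSDASQL', N'AllowInProcess', 1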
This solution is great for querying small tables; however, it seems that it doesn't use indexes, so fetching even a few rows from a large table, even filtered by a field that is indexed on the remote server, takes ages.
So (correct me if I'm wrong) for large datasets it's still better to use OPENQUERY, as the query is evaluated and optimized on the remote server, using its indexes and so on.
In case anyone is still having trouble with this: I had to go into the linked server properties -> Server Options and change RPC and RPC Out to true. Then I could run queries like [linked server]...[table].
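The same two options can also be set with T-SQL (a sketch; substitute your own linked server name):

EXEC sp_serveroption @server = N'Linked_Server', @optname = 'rpc', @optvalue = 'true';
EXEC sp_serveroption @server = N'Linked_Server', @optname = 'rpc out', @optvalue = 'true';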