SoapUI - JDBC remove table name from the response - MySQL

I was using the Oracle JDBC driver, and all my queries and assertions are based on column names. Now I need to use a MySQL database instead.
When I make a request, I can see that the column names in the response are prefixed with the table name and a dot.
How can I remove them?
If an event handler is the way to go, then I need to use SubmitListener.afterSubmit - but I have no idea how to obtain the response, modify it, and return it.
A JDBC driver configuration parameter for this does not seem to be available. I tried the Postgres driver, but it does not work with a MySQL server.
I am trying to remove the parts marked in yellow in my screenshot of the response.
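For what it's worth, the heart of such an afterSubmit handler is just a string transformation on the response XML: strip the leading table-name segment from each element name. Below is a minimal sketch of that transformation in Java; the class name, the regex, and the sample XML are illustrative assumptions, and it assumes simple tags without attributes. How you feed the modified string back into SoapUI depends on your SoapUI version, so that part is deliberately omitted.

import java.util.regex.Pattern;

public class StripTablePrefix {
    // Matches opening/closing tags of the form <table.column> or </table.column>
    // (simple tags, no attributes) and keeps only the column part.
    private static final Pattern PREFIXED_TAG =
            Pattern.compile("(</?)\\w+\\.(\\w+>)");

    public static String strip(String responseXml) {
        return PREFIXED_TAG.matcher(responseXml).replaceAll("$1$2");
    }

    public static void main(String[] args) {
        String xml = "<Results><build.idbuild>42</build.idbuild></Results>";
        System.out.println(strip(xml)); // prints <Results><idbuild>42</idbuild></Results>
    }
}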

Related

NIFI JDBC Connection to MariaDB UTF8 support

When we run select convert(field using utf8) from the MySQL tool (which is on Windows, using ODBC), my query works fine. When I run the same query from NiFi (which is on Linux, using JDBC), my query does not work. What setting am I missing?
The JDBC connection string is
jdbc:mysql://10.10.x.x:y/warehouse
We are not getting an error; the one field that is being converted just comes back empty.
What is the full query you are running? When you use a function as a field to select from, the name returned is usually something funky, and due to the strict rules on field naming in Avro (which ExecuteSQL converts the data into), it often complains, so you have to use an alias, something like select convert(field using utf8) as field from myTable.
But if you're not getting an error, I suspect maybe there's some session variable that is being set on your external MySQL tool and is not being set for NiFi's session. In the upcoming NiFi 1.9.0 release (via NIFI-5780) you'll be able to issue SET statements and such for the session, before executing your main query.
Otherwise, it's possible that the JDBC driver is returning a type to NiFi that it is not handling correctly (although for UTF8 I would expect it would report a String). I tried this with a BINARY(10) field and it works for me (using the latest master, not a released version), so I'm not sure what's going on there.
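As a sanity check outside NiFi, a minimal JDBC sketch of the aliased query might look like the following. The URL placeholders (10.10.x.x:y) are copied from the question; the user, password, and table name are assumptions.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConvertAliasCheck {
    public static void main(String[] args) throws Exception {
        // Replace the host/port placeholders and credentials with real values.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://10.10.x.x:y/warehouse", "user", "password");
             Statement stmt = conn.createStatement();
             // The alias gives the converted expression a plain, Avro-safe column name.
             ResultSet rs = stmt.executeQuery(
                 "select convert(field using utf8) as field from myTable")) {
            while (rs.next()) {
                System.out.println(rs.getString("field"));
            }
        }
    }
}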

Perl5 DBI MySQL: reliable way to get last_insert_id

In my code I use database->last_insert_id(undef,undef,undef,"id"); to get the auto-incremented primary key. This works 99.99% of the time, but once in a while it returns a value of 0.
In such situations, running a SELECT with a WHERE clause matching the values of the INSERT statement shows that the insert was successful, indicating that the last_insert_id method failed to return the proper data.
Is this a known problem with a known fix? Or should I follow up each call to last_insert_id with a check to see if it is zero and, if so, a SELECT statement to retrieve the correct ID value?
My version of MySQL is
mysql Ver 14.14 Distrib 5.7.19, for Linux (x86_64)
Edit 1: adding the actual failing code.
use Dancer2::Plugin::Database;

# <Rest of the code to create the insert parameter>

eval {
    database->quick_insert("build", $job);

    # Fetch the auto-incremented primary key of the row just inserted.
    $job->{idbuild} = database->last_insert_id(undef, undef, undef, "idbuild");

    # Fallback: if last_insert_id returned 0, look the row up directly.
    if ($job->{idbuild} == 0) {
        my $build = database->quick_select("build", $job);
        $job->{idbuild} = $build->{idbuild};
    }
};
debug("=================Scheduler build Insert=======================*** ERROR :Got Error", $@) if $@;
Note: I am using Dancer's Database plugin, whose description says:
Provides an easy way to obtain a connected DBI database handle by simply calling the database keyword within your Dancer2 application. Returns a Dancer::Plugin::Database::Core::Handle object, which is a subclass of DBI's DBI::db connection handle object, so it does everything you'd expect to do with DBI, but also adds a few convenience methods. See the documentation for Dancer::Plugin::Database::Core::Handle for full details of those.
I've never heard of this type of problem before, but I suspect your closing note may be the key. Dancer::Plugin::Database transparently manages database handles for you behind the scenes. This can be awfully convenient... but it also means that you could change from using one dbh to using a different dbh at any time. From the docs:
Calling database will return a connected database handle; the first time it is called, the plugin will establish a connection to the database, and return a reference to the DBI object. On subsequent calls, the same DBI connection object will be returned, unless it has been found to be no longer usable (the connection has gone away), in which case a fresh connection will be obtained.
(emphasis mine)
And, as ysth has pointed out in comments on your question, last_insert_id is handle-specific, which suggests that, when you get 0, that's likely to be due to the handle changing on you.
But there is hope! Continuing on in the D::P::DB docs, there is a database_connection_lost hook available, which is called when the database connection goes away and receives the defunct handle as a parameter. That would allow you to check and record last_insert_id within the hook's callback sub, which could provide a way for you to get the id without the additional query, although you'd first have to work out a means of getting that information from the callback to your main processing code.
The other potential solution, of course, would be to not use D::P::DB and manage your database connections yourself so that you have direct control over when new connections are created.

How can I get the current schema name with MyBatis?

Basically, I need to know if there is any way to get the current schema name using MyBatis.
The DB engine I'm using is MySQL.
The easiest way, for which you don't even need to do anything MyBatis-specific, would simply be a query:
SELECT DATABASE();
This should, according to the documentation, return the current database.
Alternatively, you should be able to get the Configuration from your SqlSession via getConfiguration() and get it from there somewhere, perhaps from the environment, which allows you access to the DataSource, but you will probably need some database-specific code there.
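As a concrete sketch of the query approach, you could expose it through an annotation-based mapper; the interface and method names below are assumptions:

import org.apache.ibatis.annotations.Select;

public interface SchemaMapper {
    // On MySQL, DATABASE() returns the current default schema (database) name.
    @Select("SELECT DATABASE()")
    String currentSchema();
}

Then, given an open SqlSession, something like String schema = session.getMapper(SchemaMapper.class).currentSchema(); should return the current schema name.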

SSIS package creation for integrating MSSQL and MySQL DBs

I am trying to create an SSIS package for integrating between MSSQL and MySQL. I have no prior experience working with BIDS or SSIS and am following the instructions from here.
I added the OLE DB Source, Lookup, Conditional Split, OLE DB Destination and OLE DB Command components to the Data Flow and configured the connection managers and column mappings up to the Conditional Split component.
From here, I am facing two problems:
1) After configuring the OLE DB Destination, an error symbol appears on the component saying it could not convert between Unicode and non-Unicode string data types. To solve this, I tried to insert a Data Conversion component between the Conditional Split and the Destination and configured it for the problematic column, but that doesn't seem to help.
2) While configuring the OLE DB Command, the right-hand column in the Column Mappings tab shows zero columns. I have added the SQL command with question marks, so I guess it should be showing columns named "Param_0", "Param_1", etc., if I am not wrong. I even tried to add them manually from the Input and Output Properties tab, but then it shows the warning that the external columns for the OLE DB Command are out of sync with the data source.
What am I missing here?
Thanks
The way you describe your first problem, it sounds like it should work. Here are a couple of things to check.
The data conversion component creates a new column for the converted data. Make sure you are referring to that new column in your subsequent transformations and in the destination.
Right-click on the Data Conversion component and select Advanced Editor. Select the Input and Output Properties tab in the Advanced Editor. Expand the Data Conversion Output branch of the tree view and select your new column. Ensure that the Data Type Properties show the data type that you want to convert to. If these values are not right, then something is wrong with the setup in the component.
For your second problem, the issue can frequently be caused by an error with the SqlCommand value. First, make sure the Connection Manager is correct on the Connection Manager tab. Switch to the Column Mappings tab. Near the bottom of the form, there may be a warning message that indicates that the SQL statement cannot be prepared. In other words, SSIS can't figure out what the statement is supposed to do. Address any problems with the SQL statement and switch back to the Column Mappings tab. The columns will appear once the SQL statement can be parsed.
If you want to avoid the conversion issues, then change your destination table column types from char/varchar to nchar/nvarchar. I'm pretty sure you will need to use an ADO connector for MySQL sources and destinations; you should be able to read data from the MySQL source and write to the MSSQL database without using anything other than source and destination components.

At run time, how do I verify that the database schema matches my objects?

I have data access objects that have been generated by SqlMetal; however, the database is created by running a SQL script.
Is there an easy way to verify that all table and column names and types match the attributes on the classes that SqlMetal created?
I guess the easiest way to do this would be to have some kind of version number hidden in a config table in your schema. Then, at runtime, check the version number returned.
That is much easier than doing a full scan: set the version number in your SQL script and somewhere in your data access objects.
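The runtime check itself is only a few lines. The question's stack is .NET/SqlMetal, so treat the following Java/JDBC version as a language-agnostic sketch; the connection details, the table name (schema_info), and the expected version value are all assumptions.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SchemaVersionCheck {
    // Must match the value your SQL creation script inserts into the config table.
    private static final int EXPECTED_SCHEMA_VERSION = 3;

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:sqlserver://localhost;databaseName=mydb", "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT version FROM schema_info")) {
            // Fail fast at startup if the schema doesn't match the generated classes.
            if (!rs.next() || rs.getInt("version") != EXPECTED_SCHEMA_VERSION) {
                throw new IllegalStateException(
                    "Database schema does not match the generated data access objects");
            }
        }
    }
}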