Summary
Configuration file lookup works for Execute SQL Task, but fails for Dataflow tasks.
Problem
I have 2 databases:
Source (On-premise SQL Server database)
Destination (Azure SQL Database)
I have 2 packages that I want to create from BIML code.
1) Create Staging (works fine)
Creates tables in the destination database using a foreach loop and metadata from the source database.
2) Load Staging (does not work)
Loads the created tables in the destination database using a foreach loop and Dataflow tasks (Source to Destination).
Both of these packages need to use a Package Configuration file that I have created, which stores the Username and Password of the Destination database (Azure database, using SQL Server Authentication).
Using this configuration file works fine for Package 1), but when I try to create the SSIS package using BIML code for Package 2) I get the following error:
Could not execute Query on Connection Dest: SELECT * FROM stg.SalesTaxRate. Login failed for user ''.
I have tried taking the Biml code for Package 1) and adding in a Dataflow task, and that raises the same error. It seems that an Execute SQL Task can find and use the configuration file without a problem, but a Dataflow task cannot.
Script for Package 1):
<## import namespace="System.Data" #>
<## import namespace="System.Data.SqlClient" #>
<## template language="C#" tier="2" #>
<#
string _source_con_string = #"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
string _dest_con_string = #"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
string _table_name_sql = "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'";
DataTable _table_names = new DataTable();
SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
_table_name_da.Fill(_table_names);
#>
<#+
public string RowConversion(DataRow Row)
{
string _ret = "[" + Row["COLUMN_NAME"] + "] " + Row["DATA_TYPE"];
switch (Row["DATA_TYPE"].ToString().ToUpper())
{
case "NVARCHAR":
case "VARCHAR":
case "NCHAR":
case "CHAR":
case "BINARY":
case "VARBINARY":
if (Row["CHARACTER_MAXIMUM_LENGTH"].ToString() == "-1")
_ret += "(max)";
else
_ret += "(" + Row["CHARACTER_MAXIMUM_LENGTH"] + ")";
break;
case "NUMERIC":
_ret += "(" + Row["NUMERIC_PRECISION"] + "," + Row["NUMERIC_SCALE"] + ")";
break;
case "FLOAT":
_ret += "(" + Row["NUMERIC_PRECISION"] + ")";
break;
}
return _ret;
}
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
</Connections>
<Packages>
<Package Name="005_Create_Staging_Configuration" ConstraintMode="Linear">
<PackageConfigurations>
<PackageConfiguration Name="Configuration">
<ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigEdit.dtsConfig">
</ExternalFileInput>
</PackageConfiguration>
</PackageConfigurations>
<Tasks>
<Container Name="Create Staging Tables" ConstraintMode="Linear">
<Tasks>
<# foreach(DataRow _table in _table_names.Rows) { #>
<ExecuteSQL Name="SQL-S_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
<DirectInput>
IF OBJECT_ID('stg.<#= _table["TABLE_NAME"] #>','U') IS NOT NULL
DROP TABLE stg.<#= _table["TABLE_NAME"] #>;
CREATE TABLE stg.<#= _table["TABLE_NAME"] #>
(
<#
string _col_name_sql = "select COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE from INFORMATION_SCHEMA.COLUMNS where TABLE_SCHEMA='" + _table["TABLE_SCHEMA"] + "' and TABLE_NAME='"+ _table["TABLE_NAME"] + "' order by ORDINAL_POSITION ";
DataTable _col_names = new DataTable();
SqlDataAdapter _col_names_da = new SqlDataAdapter(_col_name_sql, _source_con_string);
_col_names_da.Fill(_col_names);
for (int _i=0; _i<_col_names.Rows.Count ; _i++ )
{
DataRow _r = _col_names.Rows[_i];
if (_i == 0)
WriteLine(RowConversion(_r));
else
WriteLine(", " + RowConversion(_r));
}
#>
, append_dt datetime
)
</DirectInput>
</ExecuteSQL>
<# } #>
</Tasks>
</Container>
</Tasks>
</Package>
</Packages>
</Biml>
Script for Package 2):
<## import namespace="System.Data" #>
<## import namespace="System.Data.SqlClient" #>
<## template language="C#" tier="2" #>
<#
string _source_con_string = #"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
string _dest_con_string = #"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
string _table_name_sql = "select TABLE_SCHEMA , table_name from INFORMATION_SCHEMA.TABLES where TABLE_TYPE='BASE TABLE'";
DataTable _table_names = new DataTable();
SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
_table_name_da.Fill(_table_names);
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="Source" ConnectionString="Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016" />
<OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
</Connections>
<Packages>
<Package Name="006_Load_Staging_Configuration" ConstraintMode="Linear">
<PackageConfigurations>
<PackageConfiguration Name="Configuration">
<ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigDF.dtsConfig"></ExternalFileInput>
</PackageConfiguration>
</PackageConfigurations>
<Tasks>
<Container Name="Load Staging Tables" ConstraintMode="Linear">
<Tasks>
<# foreach(DataRow _table in _table_names.Rows) { #>
<Dataflow Name="DFT-S_<#= _table["TABLE_NAME"] #>">
<Transformations>
<OleDbSource Name="SRC-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Source">
<DirectInput>
SELECT *
FROM <#= _table["TABLE_SCHEMA"] #>.<#= _table["TABLE_NAME"] #>
</DirectInput>
</OleDbSource>
<OleDbDestination Name="DST-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
<ExternalTableOutput Table="stg.<#= _table["TABLE_NAME"] #>"/>
</OleDbDestination>
</Transformations>
</Dataflow>
<# } #>
</Tasks>
</Container>
</Tasks>
</Package>
</Packages>
</Biml>
Configuration file:
<?xml version="1.0"?>
<DTSConfiguration>
<Configuration ConfiguredType="Property" Path="\Package.Connections[Source].Properties[ConnectionString]" ValueType="String">
<ConfiguredValue>"Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016"</ConfiguredValue>
</Configuration>
<Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[ConnectionString]" ValueType="String">
<ConfiguredValue>Data Source=mpl.database.windows.net;User ID=*****;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False</ConfiguredValue>
</Configuration>
<Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[Password]" ValueType="String">
<ConfiguredValue>******</ConfiguredValue>
</Configuration>
<Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[UserName]" ValueType="String">
<ConfiguredValue>******</ConfiguredValue>
</Configuration>
</DTSConfiguration>
Note: The 'User ID', 'UserName' and 'Password' values are filled in with the correct values in the actual file.
Summary
The issue is that you are trying to use package configurations (an SSIS run-time feature) while generating the packages with Biml (a build-time process).
What is happening?
Think of it this way. If you were to manually create the SSIS package, during development, you would have to connect to the source and destination databases by specifying user names and passwords. Without connecting to the databases, SSIS will not be able to get the metadata needed. Once the package has been developed and all the metadata has been mapped, you can use package configurations. When you open or execute a package with package configurations, all hardcoded values will be replaced by configuration values. This value replacement is the SSIS run-time feature.
Now, compare that to using Biml instead of manually creating the SSIS package: When generating the packages, you are expecting the Biml engine to get the user names and passwords from the package configuration files. Since this is an SSIS run-time feature, Biml is unable to get that data, and will use the connection strings specified in the BimlScript. Since these connection strings don't specify the user names and passwords, Biml will not be able to connect and you get the error Login failed for user ''. (This is similar to creating a connection manager in SSIS without providing a user name and password and getting an error when clicking "Test Connection".)
But it works for the Execute SQL task?
It may look like it, but it doesn't. The Execute SQL task basically just gets ignored. The SQL code in Execute SQL tasks is not checked or validated by SSIS or the Biml engine until the package is executed. You can type anything in there, and SSIS will be happy until you try to execute invalid code, at which point it will give you an error. Because this code is not validated by SSIS during development, it is not validated by Biml during package generation. The package gets generated successfully, and then when you open it, the package configurations will be applied, and you will not see any errors.
An OLE DB Destination, however, is validated by both SSIS and the Biml engine during development. They both need to connect to the database to get the metadata. This is why you get an error for the second Biml file only.
Solution
Package configurations are an SSIS run-time feature only. You cannot use them to pass connection strings, user names, or passwords to the Biml engine. You can either hardcode the connection strings in your BimlScript or store them in an external metadata repository, but either way you will need to provide the user names and passwords to the Biml engine during package generation.
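For example, here is a minimal sketch of the hardcoded approach (the User ID and Password values are hypothetical placeholders; they only need to be valid at build time, since the package configuration can still overwrite the connection string at run time):
<Connections>
  <OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;User ID=BuildTimeUser;Password=BuildTimePassword;Persist Security Info=True;Auto Translate=False" />
</Connections>
With build-time credentials in place, the Biml engine can connect to the destination during generation and retrieve the metadata that the OLE DB Destination needs.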
Related
RaceConditionTest.fs
namespace FConsole
module RaceConditionTest =
let test x =
...
Program.fs
open System
open FConsole
[<EntryPoint>]
let main argv =
RaceConditionTest.test 1000
0 // return an integer exit code
Then I run my console app (Linux):
$ dotnet run
error FS0039: The namespace or module 'FConsole' is not defined.
There is only one test method in RaceConditionTest.fs.
Is the order of the files the problem? If so, how do I indicate the order of the *.fs files?
As @boran suggested in the comments, the answer is in FConsoleProject.fsproj.
I just added my file before Program.fs:
<ItemGroup>
<Compile Include="RaceConditionTest.fs" />
<Compile Include="Program.fs" />
</ItemGroup>
How many maximum cases are possible in conditional split Transformation in SSIS?
I don't know.
I do know that, using the Biml below, I was able to generate an SSIS package with a conditional split with 1024 output paths, each of which went to its own empty Derived Column.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#int upperBound = 1023;#>
<Connections>
<OleDbConnection ConnectionString="Data Source=localhost\dev2014;Initial Catalog=tempdb;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;Packet Size=32767;" Name="Source" />
</Connections>
<Packages>
<Package Name="SO_50393307">
<Tasks>
<Dataflow Name="DFT CSPL POC">
<Transformations>
<OleDbSource Name="SRC Query" ConnectionName="Source">
<DirectInput>SELECT 1 AS Col1;</DirectInput>
</OleDbSource>
<ConditionalSplit Name="CSPL Boundary Test" >
<OutputPaths>
<# foreach (int indexer in System.Linq.Enumerable.Range(0, upperBound)){#>
<OutputPath Name="Repro_<#= indexer #>"><Expression><![CDATA[Col1 == <#= indexer #>]]></Expression></OutputPath>
<#}#>
</OutputPaths>
</ConditionalSplit>
<# foreach (int indexer in System.Linq.Enumerable.Range(0, upperBound)){#>
<DerivedColumns Name="DER Anchor <#= indexer #>">
<InputPath OutputPathName="CSPL Boundary Test.Repro_<#= indexer #>" />
</DerivedColumns>
<#}#>
</Transformations>
</Dataflow>
</Tasks>
</Package>
</Packages>
</Biml>
Now, just because I was able to generate that package doesn't mean that Visual Studio could edit it. After an hour of letting VS2017 churn on opening it, I gave up.
It does, however, execute: it took 7 seconds from the command line.
I have a table in the database called locations, which contains 2 columns (id, which is auto-incremented, and location). I have a CSV file that contains a single column.
When I try to import that file into the locations table, I get the error: invalid column count in CSV input on line 1.
I also tried importing the CSV using LOAD DATA, but I get this: MySQL returned an empty result set (i.e. zero rows).
Maybe you should read the file in PHP and insert the rows yourself:
$array = array_map('str_getcsv', file('file.csv'));
This gives you more options for checking the values before inserting them.
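The invalid column count error usually means the number of columns in the CSV does not match the number of columns in the table. With LOAD DATA you can name the target columns explicitly, so the auto-incremented id is skipped. A minimal sketch, assuming the file holds one value per line (the path is a placeholder):
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE locations
LINES TERMINATED BY '\n'
(location);
Listing only (location) lets MySQL generate the id for each imported row.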
If Java is available on your machine, you can use the tool below:
https://dbisweb.wordpress.com/
The simple configuration required is:
<?xml version="1.0" encoding="UTF-8"?>
<config>
<connections>
<jdbc name="mysql">
<driver>com.mysql.jdbc.Driver</driver>
<url>jdbc:mysql://localhost:3306/test1</url>
<user>root</user>
<password></password>
</jdbc>
</connections>
<component-flow>
<execute name="file2table" enabled="true">
<migrate>
<source>
<file delimiter="," header="false" path="D:/test_source.csv"/>
</source>
<destination>
<table connection="mysql">locations</table>
</destination>
<mapping>
<column source="1" destination="location" />
</mapping>
</migrate>
</execute>
</component-flow>
</config>
If you are interested, the same can be achieved with Java code:
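// Note: Source, File, Table, JDBCConnection, Mapping and MigrateExecutor below are
// presumably classes from the dbis library linked above, not part of the JDK.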
Source source = new File("D:/test_source.csv", ',', false);
Destination destination = new Table("locations", new JDBCConnection("com.mysql.jdbc.Driver", "jdbc:mysql://localhost:3306/test1", "root", ""));
List<Mapping> mapping = new ArrayList<>();
mapping.add(new Mapping("1", "location", null));
Component component = new MigrateExecutor(source, destination, mapping);
component.execute();
I have been working on a simple BIML solution to start learning how to use it. I keep getting an error message:
Supplied connections must be of type AstDbConnectionNode for this method.
at Varigence.Biml.Extensions.ExternalDataAccess.GetDatabaseSchema in :line 0
I've been searching and trying different solutions and have not found an answer yet. So, I'm turning to everyone here. I need another set of eyes on this so I can figure out what I'm doing wrong.
My first BIML file has my connection setup to World Wide Importers on my local box.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#@ template language="C#" tier="0" #>
<Connections>
<OleDbConnection
Name="src"
ConnectionString="Data Source=localhost\SQL16;Initial Catalog=WorldWideImporters;Provider=SQLNCLI11.1;Integrated Security=SSPI;"
CreateInProject = "true">
</OleDbConnection>
</Connections>
<Databases>
<Database Name="src" ConnectionName = "src" />
</Databases>
The second BIML file is what throws the error:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#@ template language="C#" tier="1" #>
<#@ import namespace="Varigence.Biml.CoreLowerer.SchemaManagement" #>
<# var srcDB = RootNode.OleDbConnections["src"]; #>
<# var WWIdb = srcDB.GetDatabaseSchema(ImportOptions.ExcludeViews); #>
<Packages>
<# foreach (var table in WWIdb.TableNodes) { #>
<Package Name="<#=table.Schema#>_<#=table.Name#>" ConstraintMode="Linear">
<Tasks>
<Dataflow Name="DF Copy <#=table.Name#>">
</Dataflow>
</Tasks>
</Package>
<# } #>
</Packages>
</Biml>
That misleading error surfaces from the call to GetDatabaseSchema. I say it's misleading because the root problem is that srcDB is null. See for yourself by using this code in your second Biml file.
<## import namespace="System.Windows.Forms" #>
<## assembly name= "C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Windows.Forms.dll" #>
<# var srcDB = RootNode.OleDbConnections["ConnectionDoesNotExist"]; #>
<#
if (srcDB == null)
{
MessageBox.Show("It's null");
}
else
{
MessageBox.Show("It's not null - {0}", srcDB.Name);
}
#>
Root problem
You are accessing an object in the connections collection that doesn't exist - probably because, while you have your tiering correct, you need to include all the files when you build.
How do you resolve this?
If you're using BimlExpress or BIDS Helper, then you simply need to select both file1.biml and file2.biml in Solution Explorer, right-click, and choose Generate SSIS Packages.
If you are using Mist/BimlStudio, then I would just right-click on file1.biml and choose Convert to Live BimlScript.
In SoapUI, I have a JDBC Test Step that returns the following data:
<Results>
<ResultSet fetchSize="128">
<Row rowNumber="1">
<ID>1</ID>
<NAME>TestName1</NAME>
<DESCRIPTION/>
<TYPE>Bool</TYPE>
<ISPRODUCTTAG>true</ISPRODUCTTAG>
<ISLOCATIONTAG>false</ISLOCATIONTAG>
<SUBSECTION>Default Sub Section</SUBSECTION>
<SECTION>Default Section</SECTION>
<SUBGROUP>Default Sub Group</SUBGROUP>
<GROUP>Default Group</GROUP>
</Row>
<Row rowNumber="2">
<ID>2</ID>
<NAME>TestName2</NAME>
<DESCRIPTION/>
<TYPE>Bool</TYPE>
<ISPRODUCTTAG>true</ISPRODUCTTAG>
<ISLOCATIONTAG>false</ISLOCATIONTAG>
<SUBSECTION>Default Sub Section</SUBSECTION>
<SECTION>Default Section</SECTION>
<SUBGROUP>Default Sub Group</SUBGROUP>
<GROUP>Default Group</GROUP>
</Row>
</ResultSet>
</Results>
I have a REST API XML response that contains the following data:
<ArrayOfTagInfo>
<TagInfo id="1" name="TestName1" type="Bool" isProductTag="true" isLocationTag="false" subsection="Default Sub Section" section="Default Section" subgroup="Default Sub Group" group="Default Group"/>
<TagInfo id="2" name="TestName2" type="Bool" isProductTag="true" isLocationTag="false" subsection="Default Sub Section" section="Default Section" subgroup="Default Sub Group" group="Default Group"/>
</ArrayOfTagInfo>
I would like to be able to compare (assert) both the database values and the response values (the response can be in XML or JSON depending on the request's Accept header) using Groovy arrays if possible, as the data returned from the database can be very large.
Can anyone help?
You can design your test case with the following test steps:
jdbc step
json step
groovy script step
The idea/pseudocode for the Groovy script step is:
get the response of the jdbc step
get the response of the json step
build the data in the form of objects, so that it is easy to compare
store the objects in lists, one for jdbc and another for json
sort both lists, to make sure the same data is being compared
compare that both lists are the same.
Here goes the groovy script:
/**
* Model object for comparing
*/
@groovy.transform.Canonical
class Model {
def id
def name
def type
def isProductTag
def isLocationTag
def subSection
def section
def subGroup
def group
/**
* this will accept the jdbc row
* @param row
* @return
*/
def buildJdbcData(row) {
row.with {
id = ID
name = NAME
type = TYPE
isProductTag = ISPRODUCTTAG
isLocationTag = ISLOCATIONTAG
subSection = SUBSECTION
section = SECTION
subGroup = SUBGROUP
group = GROUP
}
}
/**
* this will accept the json TagInfo
* @param tagInfo
* @return
*/
def buildJsonData(tagInfo){
id = tagInfo.@id
name = tagInfo.@name
type = tagInfo.@type
isProductTag = tagInfo.@isProductTag
isLocationTag = tagInfo.@isLocationTag
subSection = tagInfo.@subsection
section = tagInfo.@section
subGroup = tagInfo.@subgroup
group = tagInfo.@group
}
}
/**
* Creating the jdbcResponse from a fixed value for testing.
* If you want, you can assign the response received directly using the line below instead; make sure you replace the step name correctly:
* def jdbcResponse = context.expand('${JdbcStepName#Response}')
*/
def jdbcResponse = '''<Results>
<ResultSet fetchSize="128">
<Row rowNumber="1">
<ID>1</ID>
<NAME>TestName1</NAME>
<DESCRIPTION/>
<TYPE>Bool</TYPE>
<ISPRODUCTTAG>true</ISPRODUCTTAG>
<ISLOCATIONTAG>false</ISLOCATIONTAG>
<SUBSECTION>Default Sub Section</SUBSECTION>
<SECTION>Default Section</SECTION>
<SUBGROUP>Default Sub Group</SUBGROUP>
<GROUP>Default Group</GROUP>
</Row>
<Row rowNumber="2">
<ID>2</ID>
<NAME>TestName2</NAME>
<DESCRIPTION/>
<TYPE>Bool</TYPE>
<ISPRODUCTTAG>true</ISPRODUCTTAG>
<ISLOCATIONTAG>false</ISLOCATIONTAG>
<SUBSECTION>Default Sub Section</SUBSECTION>
<SECTION>Default Section</SECTION>
<SUBGROUP>Default Sub Group</SUBGROUP>
<GROUP>Default Group</GROUP>
</Row>
<!--added 3rd row for testing the failure -->
<Row rowNumber="3">
<ID>3</ID>
<NAME>TestName3</NAME>
<DESCRIPTION/>
<TYPE>Bool</TYPE>
<ISPRODUCTTAG>false</ISPRODUCTTAG>
<ISLOCATIONTAG>false</ISLOCATIONTAG>
<SUBSECTION>Default Sub Section3</SUBSECTION>
<SECTION>Default Section3</SECTION>
<SUBGROUP>Default Sub Group3</SUBGROUP>
<GROUP>Default Group3</GROUP>
</Row>
</ResultSet>
</Results>'''
/**
* Creating the jsonResponse from a fixed value for testing.
* If you want, you can assign the response received directly using the line below instead; make sure you replace the step name correctly:
* def jsonResponse = context.expand('${JsonStepName#Response}')
*/
def restResponse = '''
<ArrayOfTagInfo>
<TagInfo id="1" name="TestName1" type="Bool" isProductTag="true" isLocationTag="false" subsection="Default Sub Section" section="Default Section" subgroup="Default Sub Group" group="Default Group"/>
<TagInfo id="2" name="TestName2" type="Bool" isProductTag="true" isLocationTag="false" subsection="Default Sub Section" section="Default Section" subgroup="Default Sub Group" group="Default Group"/>
<!--added 3rd row for testing the failure -->
<TagInfo id="3" name="TestName3" type="Bool" isProductTag="true" isLocationTag="false" subsection="Default Sub Section" section="Default Section" subgroup="Default Sub Group" group="Default Group"/>
</ArrayOfTagInfo>'''
//Parsing the jdbc and build the jdbc model object list
def results = new XmlSlurper().parseText(jdbcResponse)
def jdbcDataObjects = []
results.ResultSet.Row.each { row ->
jdbcDataObjects.add(new Model().buildJdbcData(row))
}
//Parsing the json and build the json model object list
def arrayOfTagInfo = new XmlSlurper().parseText(restResponse)
def jsonDataObjects = []
arrayOfTagInfo.TagInfo.each { tagInfo ->
jsonDataObjects.add(new Model().buildJsonData(tagInfo))
}
//sorting the Data before checking for equality
jdbcDataObjects.sort()
jsonDataObjects.sort()
if (jdbcDataObjects.size() != jsonDataObjects.size()) {
System.err.println("Jdbc resultset size is : ${jdbcDataObjects.size()} and Json result size is : ${jsonDataObjects.size()}")
}
assert jdbcDataObjects == jsonDataObjects, "Comparison of Jdbc and Json data is failed"
In the above, the 3rd row does not match, so the assert will throw the following error:
Caught: java.lang.AssertionError: Comparison Failed. Expression: (jdbcDataObjects == jsonDataObjects). Values: jdbcDataObjects = [Default Group3, Default Group, Default Group], jsonDataObjects = [Default Group, Default Group, Default Group]
java.lang.AssertionError: Comparison Failed. Expression: (jdbcDataObjects == jsonDataObjects). Values: jdbcDataObjects = [Default Group3, Default Group, Default Group], jsonDataObjects = [Default Group, Default Group, Default Group]
at So31472381.run(So31472381.groovy:104)
Process finished with exit code 1
If you remove the 3rd row (from both responses), you will not see any error, which indicates a successful comparison of the jdbc and json responses.
Note that the Groovy script test step is available in both the free and pro versions of SoapUI, so this solution works in both editions.
If you have SoapUI Pro, you should be able to accomplish all of this with no Groovy:
Make the REST call to retrieve all your data.
Start a DataSource step that parses the XML.
Make a JDBC call that selects the correct ID of the row you want to verify. Make all the assertions in here.
Loop back to step 2.