Creating an NServiceBus ticket using an SSIS package

Is there a way to create an NServiceBus ticket from within an SSIS package?
I am new to SSIS packages (MSSQL 2012) and was thinking of serializing an NServiceBus ticket using a Script Task and sending it to an MSMQ queue using a Message Queue Task.
Am I thinking in the right direction?

We have done this internally; you just have to make sure you get the serialization right. You also have to be aware of what gets sent in the message label. Here is what works on v2.5, which may need to be modified for 3.x:
String label = String.Format("<CorrId></CorrId><WinIdName>{0}\\Administrator</WinIdName>", Environment.MachineName);
Message message = new Message();

// Serialize the command into the message body.
XmlSerializer serializer = new XmlSerializer(typeof(IMyMessage));
serializer.Serialize(message.BodyStream, command);
message.Label = label;

MessageQueue q = new MessageQueue(queueName);
if (q != null)
{
    q.Send(message, MessageQueueTransactionType.Single);
}


API Testing / REST Assured Automation Testing: Advice and/or Suggestions

I am really looking for some practical advice and general guidance.
Below is the current scenario.
I have an Excel document; each row would be considered a test with inputs.
There would be hundreds if not thousands of rows.
For example, Row 1 would look like:
Col1        | Col2           | Col3
TestingUser | TestingSurname | 1980/01/01
This needs to be mapped to a JSON object and then sent (POST) to an API endpoint.
I then need to assert on the data coming back to make sure the values are correct.
The tools I have looked at are:
ReadyAPI
rest-assured.io
Would you recommend any other tool or framework for this type of testing?
If you have worked with something and can provide an example, that would be great.
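For the mapping step alone, here is a minimal plain-Java sketch (the JSON field names and the class name are illustrative assumptions, not part of any API), showing how one row of that sheet could become a JSON object before being posted:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RowToJson {
    // Builds a JSON object from one spreadsheet row; keys are illustrative.
    // Note: values are not escaped, so this sketch assumes plain text cells.
    static String toJson(String name, String surname, String dateOfBirth) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("name", name);
        fields.put("surname", surname);
        fields.put("dateOfBirth", dateOfBirth);
        StringBuilder sb = new StringBuilder("{");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 1) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\":\"")
              .append(e.getValue()).append("\"");
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        // The sample row from the question:
        System.out.println(toJson("TestingUser", "TestingSurname", "1980/01/01"));
    }
}
```

In a real suite you would use a JSON library (Jackson, Gson) or a template engine instead of hand-building strings; this only illustrates the row-to-object mapping the question describes.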
I wouldn't be able to give a recommendation on REST Assured as I haven't worked with it. However, below are a few advantages of ReadyAPI:
The learning curve is shallow; any tester can build test cases without depending on a programming language.
ReadyAPI has a built-in feature to read data from different data sources (DB, XML, JSON, CSV, Excel, etc.) and invoke a REST endpoint by passing these fields to the headers, query and JSON body of the endpoint.
The response for each call can be dumped to a file using the DataSink option on a test step, for each request made for the records from the file.
The tool is structured to easily build test cases with multiple test steps; it is mostly drag and drop. The hierarchy is Project -> Test Suite -> Test Case -> Test Step.
It integrates easily with a Jenkins CI/CD pipeline using testRunner, with a wide variety of test reporting capabilities. Test reports are available as Allure, Jasper and JUnit-style reports.
More technical testers who need more control can use the Groovy or JavaScript languages to build frameworks.
VirtServer and LoadUI are other tools by SmartBear that can be used to mock services and run performance tests as desired.
One important comment here: if the file is huge (even 1,000 lines), I have seen ReadyAPI struggle, as the tool does the heavy lifting in the background. I would therefore recommend using a Groovy script with the Java APIs for any file operations.
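The streaming approach that comment recommends can be sketched in plain Java (the class name, delimiter and handler are illustrative): reading line by line keeps memory usage flat no matter how large the data file is, instead of loading the whole file at once.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Consumer;

public class StreamRows {
    // Processes a delimited file one line at a time instead of loading it whole.
    // Returns the number of rows handled.
    static long forEachRow(Path file, Consumer<String[]> rowHandler) throws IOException {
        long count = 0;
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                rowHandler.accept(line.split("\\|")); // sample data uses '|' separators
                count++;
            }
        }
        return count;
    }
}
```

The same pattern works from a Groovy script step, since Groovy can call these Java APIs directly.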
OK, so I have created a class using Velocity as a JSON template engine.
I have created a test, and within that test I have a normal Java loop.
This loops through the entire XLS, maps values and posts to the API.
This is all working as expected.
The problem is that the runner displays:
Default Suite
Total tests run: 1, Passes: 0
However, the loop does run x number of times.
How can I make it so that when I execute the test, it shows total tests run 10, or the same number as the loop?
Hope this makes sense.
@Test
public void generatePostData() throws IOException {
    Workbook wb = WorkbookFactory.create(new File("data\\sc1.xlsx"));
    Sheet sheet = wb.getSheetAt(0);
    for (int i = 1; i < 10; i++) {
        // Get Excel data
        Cell testNumber = sheet.getRow(i).getCell(1);
        System.out.println(testNumber.getNumericCellValue());
        // Velocity
        VelocityEngine ve = new VelocityEngine();
        ve.init();
        // Get the template
        Template t = ve.getTemplate("post.json");
        // Create context and add data
        VelocityContext context = new VelocityContext();
        // Map data
        context.put("tpltestNumber", testNumber);
        // Render to StringWriter
        StringWriter writer = new StringWriter();
        t.merge(context, writer);
        baseURI = "someURL";
        Response response =
            given()
                .contentType("application/json")
                .body(String.valueOf(writer))
            .when()
                .post()
            .then()
                .assertThat()
                .statusCode(200)
                .extract()
                .response();
    }
}
This is the answer to the question asked in the answer section by the reporter of the main question (how to get the executed Excel row count as the total executed test-case count).
For that you have to pass the data using a method with the DataProvider annotation.
TestNG documentation
DataProvider in TestNG
@DataProvider(name = "dp")
private Object[][] dataProvider() {
    Workbook wb;
    Sheet sheet = null;
    Object[][] excelRowArray = new Object[10][]; // here 10 is the row count in the Excel file
    try {
        wb = WorkbookFactory.create(new File("data\\sc1.xlsx"));
        sheet = wb.getSheetAt(0);
    } catch (IOException e) {
        e.printStackTrace();
    }
    for (int i = 1; i < 10; i++) { // here 10 is the row count in the Excel sheet
        // Get Excel data row by row
        Cell testNumber = sheet.getRow(i).getCell(1);
        System.out.println(testNumber.getNumericCellValue());
        // Create an object array with the values taken from a single Excel row
        Object[] excelRow = new Object[]{testNumber};
        // Add the created object array to 'excelRowArray'
        excelRowArray[i - 1] = excelRow;
    }
    return excelRowArray;
}
@Test(dataProvider = "dp")
public void generatePostData(Object[] excelRow) {
    // A single excelRow is passed in each time, and this runs until all
    // object arrays in excelRowArray are finished. The total tests executed
    // will be the number of 'excelRow' object arrays in excelRowArray
    // (i.e. the Excel row count in the sheet).
    // Velocity
    VelocityEngine ve = new VelocityEngine();
    ve.init();
    // Get the template
    Template t = ve.getTemplate("post.json");
    // Create context and add data
    VelocityContext context = new VelocityContext();
    // Map data
    context.put("tpltestNumber", excelRow); // here excelRow is used as the value
    // Render to StringWriter
    StringWriter writer = new StringWriter();
    t.merge(context, writer);
    baseURI = "someURL";
    Response response =
        given()
            .contentType("application/json")
            .body(String.valueOf(writer))
        .when()
            .post()
        .then()
            .assertThat()
            .statusCode(200)
            .extract()
            .response();
}

Is there a workaround for serializing some Microsoft Graph entities with interfaces, like Domain

I wanted to write a test program to execute against our client tenants to verify we could handle all the data our new Microsoft Graph app collects. My plan was to serialize the data using:
XmlSerializer serializer = new XmlSerializer(typeof(List<T>));
It failed on the first entity I tried, Microsoft.Graph.Domain, in this case with the error:
Cannot serialize member Microsoft.Graph.Entity.AdditionalData of type ... because it is an interface.
A search on Stack Overflow found suggestions to decorate the problematic class property with XmlIgnore so XmlSerializer will ignore it; others recommended implementing IXmlSerializable. One post seemed to propose serializing to XAML.
I am open to a better way to collect real customer data which I can import into my unit tests; as a developer I do not have direct access to customer accounts.
Does anyone have other suggestions on how to serialize Microsoft Graph entities?
I replaced my XmlSerializer with a JSON one:
public void SerializeObjectsToJson<T>(List<T> serializableObjects)
{
    var jsonStr = JsonConvert.SerializeObject(serializableObjects);
    // Write jsonStr to a file as needed.
}

public List<T> DeSerializeObjectsFromJson<T>(string fqpathname)
{
    using (TextReader textReader = new StreamReader(fqpathname, Encoding.UTF8))
    {
        var jsonStr = textReader.ReadToEnd();
        return JsonConvert.DeserializeObject<List<T>>(jsonStr);
    }
}
This all seems to work with Domain, User, SubscribedSkus, Organization, etc.

quickfixj Integration with External OMS

I am doing development to integrate a non-Java OMS system with QuickFIX/J to send buy/sell orders to multiple brokerage systems.
I have written the below logic to send the messages.
It sits under the main function, which is in the same class created by implementing Application ("public class Initiator implements Application"):
InputStream inp = InitiatorSocket.class.getResourceAsStream("test.cfg");
SessionSettings sessionSetting = new SessionSettings(inp);
Application myApp = new Initiator();
FileStoreFactory factory = new FileStoreFactory(sessionSetting);
ScreenLogFactory sfactory = new ScreenLogFactory(sessionSetting);
DefaultMessageFactory defaultMsgFactory = new DefaultMessageFactory();
initiator = new SocketInitiator(myApp, factory, sessionSetting, sfactory, defaultMsgFactory);
initiator.start();
SessionID sessionId = initiator.getSessions().get(0);
I am using the below code to send messages, continuously polling a directory in a while loop:
while (true)
{
    readFilefromSrcDirectory();
    prepareFixMessage();
    Session.sendToTarget(fixMessage, sessionId);
}
The above code executes while debugging, but when I run it normally, Session.sendToTarget(fixMessage, sessionId) and the other file-read logic that follows initiator.start() is not executed.
Note that the same code does execute if we add some console print statements such as System.out.print("Test");.
Please help me.
Are your test.cfg settings different between debug and run? I would add console print statements everywhere and work out exactly where the runtime is failing. It is also worth checking that the session has actually completed logon before the loop starts sending: stepping through in a debugger (or adding print statements) slows execution down enough to mask that kind of timing problem.
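Separately, a tight `while(true)` with no pause burns a full CPU core and can behave differently under a debugger. A common fix is to poll with a short sleep between iterations; below is a plain-Java sketch of that pattern (the class name, handler and stop flag are illustrative, and the actual QuickFIX/J calls would go inside the handler):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class DirectoryPoller {
    private final AtomicBoolean running = new AtomicBoolean(true);
    private long processed = 0;

    // Polls for work at a fixed interval instead of busy-spinning.
    public long pollUntilStopped(Runnable checkDirectoryAndSend, long intervalMillis)
            throws InterruptedException {
        while (running.get()) {
            checkDirectoryAndSend.run(); // e.g. read file, build FIX message, sendToTarget
            processed++;
            Thread.sleep(intervalMillis); // yield the CPU between polls
        }
        return processed;
    }

    public void stop() {
        running.set(false);
    }
}
```

For directory watching specifically, `java.nio.file.WatchService` avoids polling altogether, but the sleep-based loop is the smallest change to the code above.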

SQL Server Express 2008 Merge Replication using RMO causes "An exception occurred while executing a Transact-SQL statement or batch." error?

I am trying to create a merge replication using RMO programming, which I got from here!
string publisherName = "DataSourceName";
string publicationName = "AdvWorksSalesOrdersMerge";
string publicationDbName = "AdventureWorksDW2008R2";
ReplicationDatabase publicationDb;
MergePublication publication;

// Create a connection to the Publisher.
ServerConnection conn = new ServerConnection(publisherName);
try
{
    // Connect to the Publisher.
    conn.Connect();

    // Enable the database for merge publication.
    publicationDb = new ReplicationDatabase(publicationDbName, conn);
    if (publicationDb.LoadProperties())
    {
        if (!publicationDb.EnabledMergePublishing)
        {
            publicationDb.EnabledMergePublishing = true;
        }
    }
    else
    {
        // Do something here if the database does not exist.
        throw new ApplicationException(String.Format(
            "The {0} database does not exist on {1}.",
            publicationDb, publisherName));
    }

    // Set the required properties for the merge publication.
    publication = new MergePublication();
    publication.ConnectionContext = conn;
    publication.Name = publicationName;
    publication.DatabaseName = publicationDbName;

    // Enable precomputed partitions.
    publication.PartitionGroupsOption = PartitionGroupsOption.True;

    // Specify the Windows account under which the Snapshot Agent job runs.
    // This account will be used for the local connection to the
    // Distributor and all agent connections that use Windows Authentication.
    publication.SnapshotGenerationAgentProcessSecurity.Login = userid;
    publication.SnapshotGenerationAgentProcessSecurity.Password = password;

    // Explicitly set the security mode for the Publisher connection to
    // Windows Authentication (the default).
    publication.SnapshotGenerationAgentPublisherSecurity.WindowsAuthentication = true;

    // Enable Subscribers to request snapshot generation and filtering.
    publication.Attributes |= PublicationAttributes.AllowSubscriberInitiatedSnapshot;
    publication.Attributes |= PublicationAttributes.DynamicFilters;

    // Enable pull and push subscriptions.
    publication.Attributes |= PublicationAttributes.AllowPull;
    publication.Attributes |= PublicationAttributes.AllowPush;

    if (!publication.IsExistingObject)
    {
        // Create the merge publication.
        publication.Create();
        // Create a Snapshot Agent job for the publication.
        publication.CreateSnapshotAgent();
    }
    else
    {
        throw new ApplicationException(String.Format(
            "The {0} publication already exists.", publicationName));
    }
}
catch (Exception ex)
{
    // Implement custom application error handling here.
    throw new Exception(String.Format(
        "The publication {0} could not be created.", publicationName), ex);
}
finally
{
    conn.Disconnect();
}
but at this line:
publicationDb.EnabledMergePublishing = true;
I am getting the error "An exception occurred while executing a Transact-SQL statement or batch."
Please help me out with this problem.
Waiting for your answers.
You probably have your answer by now, but for those who might be asking the same question: it's because you are using an Express edition of SQL Server, and a Publisher/Distributor cannot be created in any version of SQL Server Express.
The instance you have in your code is not a valid instance, hence the exception is thrown.
Take a look at:
http://msdn.microsoft.com/en-us/library/ms151819(v=sql.105).aspx
and the lines that say:
Microsoft SQL Server 2008 Express can serve as a Subscriber for all types of replication, providing a convenient way to distribute data to client applications that use this edition of SQL Server. When using SQL Server Express in a replication topology, keep the following considerations in mind:
SQL Server Express cannot serve as a Publisher or Distributor. However, merge replication allows changes to be replicated in both directions between a Publisher and Subscriber.

groovyPagesTemplateEngine failing in quartz job with error

We are using Quartz scheduling in our application to schedule jobs that generate and send self-audit emails.
I am trying to generate the processed email body from the email template using GroovyPagesTemplateEngine.
The email template is processed properly into the email body when the processing does not use Quartz scheduling. But when a job is run using Quartz for email template processing, the GroovyPagesTemplateEngine fails in the Quartz job with this error:
java.lang.IllegalStateException: TemplateEngine not initialised correctly, no [resourceLoader] specified!
This is what I am trying to do:
def getInfo() {
    MockHttpServletRequest servletRequest = new MockHttpServletRequest()
    GrailsWebRequest grailsWebRequest = new GrailsWebRequest(servletRequest, new MockHttpServletResponse(), new MockServletContext())
    grailsWebRequest.setAttribute(GrailsApplicationAttributes.WEB_REQUEST, grailsWebRequest, 0)
    RequestContextHolder.requestAttributes = grailsWebRequest
    GroovyPagesTemplateEngine engine = new GroovyPagesTemplateEngine()
    StringWriter sw = new StringWriter()
    PrintWriter pw = new PrintWriter(sw)
    engine.createTemplate('mytemplate').make(model).writeTo(pw)
    println sw.toString()
    return sw.toString()
}
I am aware that the Quartz scheduler does not have a web request associated with it, and I am thinking that the email processing is failing because of this.
How can I process the email template to generate the email body content when the scheduled job is run, rather than by logging into the application from the UI?
Thanks in advance.
The resourceLoader is not initialized in the groovyPagesTemplateEngine because you are creating a new instance of it directly. Instead, you should let Spring's dependency injection do the job for you.
Add the following to your service:
class YourService {
    def groovyPagesTemplateEngine

    def getInfo() {
        GroovyPagesTemplateEngine engine = groovyPagesTemplateEngine
        // your code here
    }
}
You can try using the steps mentioned in http://www.intelligrape.com/blog/2010/12/27/request-mocking-to-use-groovypagestemplateengine-in-backend-threads/
If you are on Grails 2.0.x, you get a bean named groovyPageRenderer, which can be used outside the context of a web request as well. For more details, see http://mrhaki.blogspot.in/2012/03/grails-goodness-render-gsp-views-and.html