How to get message data from the human task event listener in WSO2 BPS

When a human task instance is created, I can get task information in the event listener, such as the task id, task subject, and task owner. Sample code is here. But how can I get the task message data (the so-called payload)? I tried to use the task id to query the database from the listener:
SELECT * FROM ht_message WHERE message_type = 'INPUT' AND task_id = 7001
But no results were returned; at that point the task had not yet been stored in the database.
I also tried sleeping the current thread for several seconds to wait for the task data to be saved, but it turns out the save happens only after the listener has finished.

Solved this by modifying the source code:
Add a field called "body", with getter/setter methods, to TaskInfo.java
Add a getData() method to the MessageDAO.java interface
Implement getData() in Message.java
Populate the message data into the task info in CommonTaskUtil.java
Then in the task listener I can get the message data with:
String messageData = taskEventInfo.getTaskInfo().getBody();
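For reference, the TaskInfo.java change looks roughly like this (a sketch only; apart from the "body" field and its accessors named above, the surrounding structure is assumed):

// Sketch of the addition to TaskInfo.java: a "body" field carrying the
// task's input message payload. CommonTaskUtil populates it from
// MessageDAO.getData() when the event's task info is built.
public class TaskInfo {

    // ...existing fields such as id, subject, and owner omitted...

    private String body;

    public String getBody() {
        return body;
    }

    public void setBody(String body) {
        this.body = body;
    }
}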

Related

Running a Function in the background while asking for more inputs

I am creating a little data-processing script using Selenium, where I input my values and it runs a function to do the task on a website. I would like to queue inputs so that I can enter the new values while it works on the old ones.
customername = ""
while customername != "1":  # input() returns a string, so compare against "1"
    print("Customer name")
    customername = input()
    print("Credit amount")
    creditamount = input()
    addcredit(driver, customername, creditamount)
How would I get the function addcredit() to run while the loop continues and asks me for the next set of inputs?
Thank you all!
So after a bit more research, I used threading.

import threading

p1 = threading.Thread(target=addcredit, args=(driver, customername, creditamount))
p1.start()

This is allowing my script to run as intended: it starts the action and then allows me to type more data in to run the action again. From my understanding, when the function called in the second thread sleeps, execution bounces back to the first thread and continues on. Someone, please correct me if I am wrong.
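Put together with the loop from the question, it looks something like this (a sketch; the sentinel handling is an assumption about how the loop is meant to stop, and driver/addcredit come from the original script):

import threading

customername = ""
while customername != "1":  # enter 1 to stop, per the original loop condition
    customername = input("Customer name: ")
    if customername == "1":
        break
    creditamount = input("Credit amount: ")
    # Hand the slow Selenium work to a background thread so the loop
    # can prompt for the next customer straight away.
    threading.Thread(target=addcredit,
                     args=(driver, customername, creditamount)).start()

And the asker's understanding is roughly right: CPython threads take turns under the GIL, but Selenium calls spend most of their time blocked waiting on the browser, which is exactly when the other thread gets to run.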

Capturing runtime for each task within a dataflow in SSIS 2012

In my SSIS package I have a dataflow that looks something like this.
My requirement is to log the end time of each Flat File destination (or the time when each of the flat files is created) in a SQL Server table, with one row per flat file in the log table. Is there any simple way (preferably) to accomplish this? Thanks in advance.
Update: I ended up using a script task after the dataflow to read the creation time of each file created in the dataflow. I also used the same script task to insert the logs into the table, just to keep things in one place. For details, refer to the post marked as the answer.
To get an accurate date and timestamp for each flat file created as a destination, create three new global variables and set up a Foreach Loop container in the control flow following your current data flow task. Inside the Foreach Loop, add a script task that reads the date/time information from one flat file at a time and saves it to one of the new global variables; a second Execute SQL task (also in the Foreach Loop) then writes that information to a database table.
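Inside the script task, reading the creation time could look something like this (a C# sketch; the variable names User::FilePath and User::FileCreatedDate are placeholders, not from the original post):

using System.IO;

public void Main()
{
    // Path of the current file, mapped from the Foreach Loop's
    // enumerator into a read-only package variable (name assumed).
    string path = (string)Dts.Variables["User::FilePath"].Value;

    // Save the file's creation time into a read/write variable for
    // the Execute SQL task to write to the log table.
    Dts.Variables["User::FileCreatedDate"].Value = File.GetCreationTime(path);

    Dts.TaskResult = (int)ScriptResults.Success;
}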
The following link provides a good example of the steps you'll need to apply; it includes a few extra steps that aren't applicable here, which you can easily exclude.
http://microsoft-ssis.blogspot.com/2011/01/use-filedates-in-ssis.html
Hope this helps.
After looking more closely at the toolbox, I think the best way to do this is to move each source/destination pairing into its own dataflow and use the OnPostExecute event of each dataflow to write to the SQL table.
Wanted to provide more detail on @TabAlleman's approach.
For each control flow task with a name like Bene_hic, you will have a source file and a destination file.
On the 'Event Handlers' tab for that executable (use the drop-down list), you can create the OnPostExecute event.
In that event, I have two SQL tasks: one generates the SQL to execute for this control flow task, and the second executes that SQL.
These SQL tasks depend on two user variables scoped to the OnPostExecute event, with the EvaluateAsExpression property for both set to True. The first one, Variable1, is used as a template for the SQL to execute and has a value like:
"SELECT execSQL FROM db.Ssis_onPostExecute
where stgTable = '" + #[System::SourceName] + "'"
#[System::SourceName] is an SSIS system variable containing the name of the control flow task.
I have a table in my database named Ssis_onPostExecute with two fields, an execSQL field with values like:
DELETE FROM db.TableStats WHERE TABLENAME = 'Bene_hic';

INSERT INTO db.TableStats
SELECT t.CreatorName, t.tname, CURRENT_TIMESTAMP, u.rcnt
FROM (SELECT databasename, TABLENAME AS tname, CreatorName
      FROM dbc.TablesV) t
INNER JOIN
     (SELECT 'Bene_hic' AS tname, COUNT(*) AS rcnt
      FROM db.Bene_hic) u
  ON t.tname = u.tname
WHERE t.databasename = 'db'
  AND t.tname = 'Bene_hic';
and a stgTable field containing the name of the corresponding control flow task in the package (case-sensitive!), like Bene_hic.
In the first SQL task (named SQL), I have SourceVariable set to a user variable (User::Variable1) and the ResultSet property set to 'single row'. The Result Set detail has Result Name = 0 and the Variable Name set to the second user variable (User::Variable2).
In the second SQL task (named exec), I have the SQLSourceType property set to Variable and the SourceVariable property set to User::Variable2.
The package is then able to copy the data from the source object to the destination and, whether it fails or not, enter a row in a table with the timestamp and number of rows copied, along with the table name and anything else you want to track.
Also, when debugging, you have to run the whole package, not just one task in the event handler; the variables won't be set correctly otherwise.
HTH; it took me forever to figure all this out, working from examples on several web sites. I'm using code to generate the SQL in the execSQL field for each of the 42 control flow tasks, which meant creating 84 user variables.
-Beth
The easy solution would be:
1) Drag an OLE DB Command from the toolbox after the Flat File destination.
2) Write an update statement that stamps a log table with the current date when the Flat File destination succeeds (a sketch follows this list).
3) Create a variable (project scope) holding the current system date/time.
4) Depending on how your package is constructed, you might need another variable to record success or failure.
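The update in step 2 might look something like this (a sketch; the log table and its columns are hypothetical):

-- Hypothetical log table; substitute your own schema. The ? parameters
-- are mapped from package variables in the OLE DB Command editor.
UPDATE dbo.FileLoadLog
SET    CompletedAt = ?   -- the date/time variable from step 3
WHERE  FileName    = ?;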

SSIS, Foreach Loop, Event Handler when it writes 0 records due to file containing zero records

I have a Foreach Loop container in my SSIS package. The Data Flow task imports .xls files from a certain directory. If an .xls file contains no data, the package shows in the log that it wrote 0 records. This is fine; however, I would like to invoke an event handler that will notify the user (by email), or have my user interface pop up a message, that no records were processed.
Is this possible? If so, I wasn't sure which event handler to choose and how to configure it properly.
Create an integer variable.
Place a Row Count transformation between the source and destination of the dataflow, and map its result to the variable.
Create an email task and connect the dataflow task to it with a "Success" constraint set to use expression evaluation.
Use an expression that compares the variable to zero, e.g. @[User::RowCount] == 0 (substituting whatever you named the variable).
This will send an email only when the dataflow results in zero records.

Avoiding race conditions for a custom get_or_create in Django?

Can anyone advise on the following problem:
I have a custom get_or_create method, which checks multiple fields and does some fancy stuff upon creation:
def fancy_get_or_create(self, name):
    obj = self.fancy_get(name)
    if not obj:
        obj = self.fancy_create(name)
    return obj

def fancy_get(self, name):
    return self.filter(Q(name=name) | Q(alias=name)).first()

def fancy_create(self, name):
    name = self.some_preprocessing(name)
    return self.create(name=name, alias=name)
There's a race condition: one request checks whether the object exists, finds nothing, and starts creating it. Before that request finishes creating the object, another request comes in looking for the same object, finds nothing, and begins creating a new one. The second request then fails, because the first has just created the object and the database's uniqueness constraints are violated.
Is there any way to prevent request 2 from querying the database until request 1 has finished? I was reading about transaction management, and it did not seem like the solution, since the issue is not partial updates (which would call for an atomic transaction) but rather the need to make the second request wait until the first has finished.
Thanks!
Update:
Here's what I went with:
try:
    return self.fancy_get(name) or self.fancy_create(name)
except IntegrityError:
    return self.fancy_get(name)
There are two viable solutions:
Use a mutex so only one process can access the fancy_get_or_create function at a time.
Capture the error thrown by the database and handle it instead: ignore that create, update the row instead of creating it, throw an exception, etc.
Edit: another solution might be doing an INSERT IGNORE instead of just an INSERT. https://dev.mysql.com/doc/refman/5.1/en/insert.html
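Putting the accepted approach into a custom manager, a minimal sketch might look like this (the transaction.atomic wrapper is an addition, there to keep the IntegrityError from breaking any surrounding transaction):

from django.db import IntegrityError, transaction
from django.db.models import Manager, Q

class FancyManager(Manager):
    def fancy_get(self, name):
        return self.filter(Q(name=name) | Q(alias=name)).first()

    def fancy_create(self, name):
        name = self.some_preprocessing(name)  # helper from the question
        return self.create(name=name, alias=name)

    def fancy_get_or_create(self, name):
        try:
            # atomic() creates a savepoint, so a failed INSERT can be
            # rolled back cleanly if a concurrent request wins the race.
            with transaction.atomic():
                return self.fancy_get(name) or self.fancy_create(name)
        except IntegrityError:
            # Another request created the row between our get and create.
            return self.fancy_get(name)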

CakePHP Accessing Dynamically Created Tables?

As part of a web application, users can upload files of data; each upload generates a new table in a dedicated MySQL database to store the data. They can then manipulate this data in various ways.
The next version of this app is being written in CakePHP, and at the moment I can't figure out how to dynamically assign these tables at runtime.
I have the different database configs set up and can create the tables on data upload just fine, but once this is completed I cannot access the new table from the controller as part of the record CRUD actions for the data manipulation.
I hoped that it would be along the lines of
function controllerAction() {
    $this->uses[] = 'newTable';
    $data = $this->newTable->find('all');
    // use data
}
But it returns the error:
Undefined property: ReportsController::$newTable
Fatal error: Call to a member function find() on a non-object in /app/controllers/reports_controller.php on line 60
Can anyone help?
You need to call $this->loadModel('newTable') to initialize it properly; Cake needs to set up $this->newTable and call all the model callbacks.
Of course, you don't need $this->uses[] = 'newTable'; that does nothing except add another value to the $uses array.
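So the action would look something like this (a sketch; 'NewTable' follows Cake's model-naming convention rather than anything given in the question):

function controllerAction() {
    // loadModel() instantiates the model, runs its callbacks, and
    // attaches it as a property on the controller.
    $this->loadModel('NewTable');
    $data = $this->NewTable->find('all');
    // use $data
}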
try:

function controllerAction() {
    $data = ClassRegistry::init('ModelNameForNewTable')->find('all');
}

If your table is called 'new_tables', your model name should be 'NewTable'.