Change Task triggering but not reflecting the field values on OIM process Form - identity

When I change any of a user's attributes from the User Profile, the change is applied in OIM and in the target as well. The change tasks are triggered (they show up when I check the Resource History), but the updated data is not shown in the OIM process form.
What could be the cause, what is a possible solution, and where should I start debugging?

Some things to double check:
In the process definition, does the change task exist, and does it simply copy the user attribute to the process data field?
Is the change task listed in the Lookup.USR_PROCESS_TRIGGERS lookup? (You can query this directly; see the SQL just below.)
Was an XX Updated task triggered? If so, how was it triggered, and does it pull its data from the user definition or from the process form?
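Here is some SQL to list what is in that lookup (a sketch against the standard OIM schema tables LKU and LKV):
-- Lists the user-column-to-task mappings in Lookup.USR_PROCESS_TRIGGERS
SELECT lkv.lkv_encoded as usr_column
, lkv.lkv_decoded as task_name
FROM lku, lkv
WHERE lku.lku_key = lkv.lku_key
AND lku.lku_type_string_key = 'Lookup.USR_PROCESS_TRIGGERS';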
Here is some SQL to get the resource history:
-- Grabs resource history by ORC_KEY
SELECT osi.sch_key
, osi.orc_key
, osi.mil_key
, to_char(osi.osi_assigned_date,'dd-mon-rrrr hh24:mi:ss') as OSI_ASSIGNED_DATE
, to_char(osi.osi_update,'dd-mon-rrrr hh24:mi:ss') as OSI_UPDATED
, osi.rsc_key, sch.sch_status, sch.sch_data, mil.mil_name, sch.sch_update
, to_char(sysdate,'dd-mon-rrrr hh24:mi:ss') as NOW
FROM osi, sch, mil
WHERE osi.sch_key = sch.sch_key
AND osi.mil_key = mil.mil_key
AND osi.orc_key in ('<PUT_ORC_KEY_HERE>')
ORDER BY sch.sch_update DESC;

Related

How to update multiple records in the same table using an .AfterUpdate data macro without the error "A data macro resource limit was hit."

I have a table tblItems with a list of inventory items. The table has many columns to describe these items, including columns for SupplierName, SupplierOrderNumber and PredictedArrivalDate.
If I order several new items from a supplier, I will record each item separately in the table with the same supplier name, order number and a predicted arrival date.
I would like to add a data macro, so that if I update the PredictedArrivalDate for one record, the value will be copied to the PredictedArrivalDate column of other records/items with the same SupplierName AND SupplierOrderNumber.
The closest I've got is:
SetLocalVar (MySupplierName, [SupplierName])
SetLocalVar (MySupplierOrderNumber, [SupplierOrderNumber])
SetLocalVar (MyPredictedArrivalDate, [PredictedArrivalDate])
For Each Record in tblItems
    Where Condition = [SupplierOrderNumber] Like [MySupplierOrderNumber] And [SupplierName] Like [MySupplierName] And [PredictedArrivalDate]<>[MyPredictedArrivalDate]
    Alias OtherRecords
    EditRecord
        SetField ([OtherRecords].[PredictedArrivalDate], [MyPredictedArrivalDate])
    End EditRecord
However, when I run this, only 5 records update, and the error log reports error -20341:
"A data macro resource limit was hit. This may be caused by a data
macro recursively calling itself. The Updated() function may be
used to detect which field in a record has been updated to help
prevent recursive calls."
How can I get this working?
I'm not one for using macros to do anything, so I'd use VBA and recordsets/an action query to do the updating.
You can call a user-defined function inside a data macro by setting a local var equal to its result.
Access doesn't like data macros triggering themselves, because of the risk of accidentally creating endless loops, and that is exactly what you are doing: an After Update macro that updates fields in the same table on a different record. It looks like you triggered a safeguard designed to prevent this. I'd try to avoid that pattern as much as possible.
Note: using user-defined functions inside data macros can cause problems when you're linking to the table from outside of Access (via ODBC for example).
This isn't a good solution (it's not a data macro), but it does work as a temporary fix.
I created an update query called "updatePredictedArrivalDate":
PARAMETERS
ItemID Long,
MyPredictedArrivalDate DateTime,
MySupplierName Text ( 255 ),
MySupplierOrderNumber Text ( 255 );
UPDATE tblItems
SET tblItems.PredictedArrivalDate = [MyPredictedArrivalDate]
WHERE (((tblItems.SupplierName) = [MySupplierName])
AND ((tblItems.SupplierOrderNumber) = [MySupplierOrderNumber])
AND ((tblItems.ID) <> [ItemID]));
On the PredictedArrivalDate form field .AfterUpdate event, I then added this macro:
IF [PredictedArrivalDate].[OldValue]<>[PredictedArrivalDate] Or [PredictedArrivalDate]<>""
OpenQuery (updatePredictedArrivalDate, Datasheet, Edit, [ID], [PredictedArrivalDate], [SupplierName], [SupplierOrderNumber])
I now have to remember to add this .AfterUpdate event to any other forms I create that amend that particular field.
If anyone has a better solution, please let me know.

What is the best way to update another table based on a certain condition ON UPDATE?

I have two tables, subscription and subscription_event. A subscription_event can be one of the following types:
public enum SubscriptionEventType {
    CREATED,
    CANCELED,
    CHARGED_SUCCESSFULLY,
    CHARGED_UNSUCCESSFULLY,
    EXPIRED,
    TRIAL_STARTED,
    TRIAL_ENDED,
    WENT_ACTIVE, // Subscription went active and can be charged from now on.
    WENT_PAST_DUE;

    public Long getValue() {
        return this.ordinal() + 1L;
    }
}
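For concreteness, assume the two tables look roughly like this (a sketch; the real schema isn't shown here, so the column names are assumptions reused in the snippets below):
-- Assumed schema, for illustration only
CREATE TABLE subscription (
    id BIGINT PRIMARY KEY AUTO_INCREMENT,
    state BIGINT NOT NULL DEFAULT 1,   -- holds SubscriptionEventType.getValue()
    state_updated_at DATETIME NULL     -- timestamp of the event that last set state
);
CREATE TABLE subscription_event (
    id BIGINT PRIMARY KEY AUTO_INCREMENT,
    subscription_id BIGINT NOT NULL,
    event_type BIGINT NOT NULL,        -- holds SubscriptionEventType.getValue()
    created_at DATETIME NOT NULL,
    FOREIGN KEY (subscription_id) REFERENCES subscription (id)
);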
What I want to do is keep the state of subscription in line with the most recent event. The problem: those events do not arrive in the correct order. E.g. it is possible to get a CHARGED_SUCCESSFULLY event before a WENT_ACTIVE event.
There are several ways I could accomplish this. First, I can check the condition in my application layer and always set the "most recent" state based on the timestamp of the event.
Long subscriptionId = lastRecordedEvent.getSubscriptionId();
if (event.getTimestamp() > lastRecordedEvent.getTimestamp()) {
    // Since the current event is more recent than all other events
    // we also have to update the subscription state
    subscriptionRepository.updateState(subscriptionId, event.getTimestamp());
}
However, I do not want to do this in my application layer. Another solution would be a TRIGGER on the subscription_event table that decides whether or not to update the relevant subscription. The reason I have not gone for that just yet is that triggers can easily be forgotten and can be a pain to maintain. I also know one should consider every other option before using a TRIGGER, but since I am not a SQL/MySQL expert I'm not aware of all my options here.
So what would be the most practicable way to keep subscription up-to-date in this situation?
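For reference, the kind of trigger I have in mind would look roughly like this (a sketch against the assumed schema above, untested; state_updated_at is an assumed helper column recording which event last set the state):
-- Only let an event overwrite the state if it is newer than the
-- event that last set it; late-arriving older events are ignored
CREATE TRIGGER trg_subscription_event_ai
AFTER INSERT ON subscription_event
FOR EACH ROW
UPDATE subscription
SET state = NEW.event_type,
    state_updated_at = NEW.created_at
WHERE id = NEW.subscription_id
AND (state_updated_at IS NULL OR state_updated_at < NEW.created_at);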
Insert your event as usual into the table and then execute the following (written here with MySQL's multi-table UPDATE join syntax, since MySQL has no UPDATE ... FROM):
UPDATE subscriptions s
INNER JOIN events e ON s.id = e.subscriptionId
SET s.state = e.state
WHERE e.subscriptionId = ?
AND e.timestamp = (SELECT MAX(timestamp) FROM events WHERE subscriptionId = ?);
You will need to pass the subscription id of the event you just inserted for both ? parameters.
EDIT
An alternative approach: rather than keeping a status field in the database, create a view for your subscriptions and always query the view instead.
CREATE VIEW vw_subscriptions AS
SELECT s.id, otherfields, COALESCE(e.state, 1) AS state
FROM subscriptions s
LEFT OUTER JOIN events e ON s.id = e.subscriptionId
AND e.timestamp = (SELECT MAX(timestamp) FROM events WHERE subscriptionId = s.id);
If you are worried about forgetting/maintaining the SQL or triggers, document them as comments in your repository functions and maintain all changes to the database as a change script that you store with your source code. That way your changes are all in your source control.

MS Access Lookup Populate

I have recently started managing an Access DB used for reporting. Currently a single row has a 'status' that can be one of many options selected by a dropdown field. When reporting, each of these ~15 statuses rolls up to one of 5 'rollup statuses' which is currently translated via an Excel interface. I would like to add a column to the database table that automatically populates the correct 'rollup status' based on the selected 'status'. I do not know if this is a calculated field, a lookup, etc. as I have very minimal Access knowledge.
For example:
[Status]---->[Rollup Status]
To Be Scheduled----> Planning
TBD---->Planning
Scheduled---->Scheduled
DMM Pending---->Scheduled
EEP Created---->Scheduled
Cleanup Pending---->Complete
Complete---->Complete
If I understand your question correctly, perhaps this would work.
In design mode: add the new column for the rollup statuses to your table (which I will call 'total') and name it something like 'rollup_status'.
In SQL query mode, set each rollup value per status, for example:
UPDATE total
SET rollup_status = 'Planning'
WHERE [status] = 'To Be Scheduled';
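Repeating that for all ~15 statuses gets unwieldy, so a more maintainable variant is to keep the mapping in its own table (a sketch; tblRollup is an assumed name for a two-column table holding the status/rollup pairs from your question):
-- One UPDATE covers every status via the mapping table
UPDATE total INNER JOIN tblRollup
ON total.[status] = tblRollup.[status]
SET total.rollup_status = tblRollup.rollup_status;
Re-mapping a status is then a data change in tblRollup rather than an edit to the query.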

Capturing runtime for each task within a dataflow in SSIS 2012

In my SSIS package I have a dataflow that looks something like this.
My requirement is to log the end time of each flat file destination (or the time when each of the flat files is created) in a SQL Server table. To be more clear, there will be one row per flat file in the log table. Is there any (preferably simple) way to accomplish this? Thanks in advance.
Update: I ended up using a script task after the dataflow to read the creation time of each file created in the dataflow. I used the same script task to insert the logs into the table, just to keep things in one place. For details, refer to the post marked as answer.
In order to get an accurate date and timestamp for each flat file created at the destination, you'll need to create three new global variables and set up a Foreach Loop container in the control flow, following your current data flow task. Inside the loop, add a script task that reads the date/time information from one flat file at a time and saves it to one of the new global variables. A second SQL task (also in the loop) can then write that information to a database table.
The following link provides a good example of the steps you'll need to apply. It includes a few steps that aren't applicable here, which you can easily exclude.
http://microsoft-ssis.blogspot.com/2011/01/use-filedates-in-ssis.html
Hope this helps.
After looking more closely at the toolbox, I think the best way to do this is to move each source/destination pairing into its own dataflow and use the OnPostExecute event of each dataflow to write to the SQL table.
Wanted to provide more detail to @TabAlleman's approach.
For each control flow task with a name like Bene_hic, you will have a source file and a destination file.
On the 'Event Handlers' tab for that executable (use the drop-down list), you can create the OnPostExecute event.
In that event, I have two SQL tasks. One generates the SQL to execute for this control flow task, the second executes the SQL.
These SQL tasks are dependent on two user variables scoped in the OnPostExecute event. The EvaluateAsExpression property for both is set to True. The first one, Variable1, is used as a template for the SQL to execute and has a value like:
"SELECT execSQL FROM db.Ssis_onPostExecute
where stgTable = '" + #[System::SourceName] + "'"
#[System::SourceName] is an SSIS system variable containing the name of the control flow task.
I have a table in my database named Ssis_onPostExecute with two fields, an execSQL field with values like:
DELETE FROM db.TableStats WHERE TABLENAME = 'Bene_hic';
INSERT INTO db.TableStats
SELECT CreatorName, t.tname, CURRENT_TIMESTAMP, rcnt
FROM (SELECT databasename, TABLENAME AS tname, CreatorName FROM dbc.TablesV) t
INNER JOIN (SELECT 'Bene_hic' AS tname, COUNT(*) AS rcnt FROM db.Bene_hic) u
ON t.tname = u.tname
WHERE t.databasename = 'db' AND t.tname = 'Bene_hic';
and a stgTable field with the name of the corresponding control flow task in the package (case-sensitive!) like Bene_hic
In the first SQL task (named SQL), I have the SourceVariable set to a user variable (User::Variable1) and the ResultSet property set to 'single row'. The Result Set detail includes Result Name = 0 and, as the Variable name, the second user variable (User::Variable2).
In the second SQL task (exec), I have the SQLSourceType property set to Variable and the SourceVariable property set to User::Variable2.
Then the package is able to copy the data in the source object to the destination, and whether it fails or not, enter a row in a table with the timestamp and number of rows copied, along with the table name and anything else you want to track.
Also, when debugging, you have to run the whole package, not just one task in the event. The variables won't be set correctly otherwise.
HTH, it took me forever to figure all this stuff out, working from examples on several web sites. I'm using code to generate the SQL in the execSQL field for each of the 42 control flow tasks, meaning I created 84 user variables.
-Beth
The easy solution will be:
1) Drag an OLE DB Command from the toolbox into the data flow after the flat file destination.
2) Set its update script to write the current date to your log table when the flat file destination succeeds (a sketch follows this list).
3) Create a variable (project scope) whose value is the system date/time.
4) You may need another variable, depending on how your package is constructed, to record success or failure.
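The update script behind step 2 can be very small (a sketch; FlatFileLog and its columns are assumed names, and the two ? placeholders would be mapped to your file name and datetime variables in the OLE DB Command):
-- Log one row per flat file produced
INSERT INTO dbo.FlatFileLog (FileName, CompletedAt)
VALUES (?, ?);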

Check if MySQL record added in the last x seconds

I have a MySQL database and a table where new records for a project are created. Each project created has a "project name" and an event created date (of type DATETIME).
There can be two projects created with the same name, but if they get created by the same user in quick succession, it is safe to assume it was a mistake on the user's part (clicking twice, refreshing the browser when event variables are passed, etc.).
How do I write a SQL statement to check whether a record with the same name already exists and was added in the last 10 seconds? So far I have the following, although I don't know how to check for the last 10 seconds.
select * from projects where user = 'johnsmith' AND projectname = 'test' AND active='y' AND DATE(projectcreatedon) = CURRENT_DATE AND DATEPART() < ....?
replace AND DATE(projectcreatedon) = CURRENT_DATE AND DATEPART() < ....? with:
AND projectcreatedon > (now() - INTERVAL 10 SECOND)
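If you'd rather do the check and the insert in one round trip, a guarded insert along these lines should also work (a sketch; the column list is assumed from the question):
INSERT INTO projects (`user`, projectname, active, projectcreatedon)
SELECT 'johnsmith', 'test', 'y', now()
FROM dual
WHERE NOT EXISTS (
    SELECT 1 FROM projects
    WHERE `user` = 'johnsmith'
    AND projectname = 'test'
    AND active = 'y'
    AND projectcreatedon > (now() - INTERVAL 10 SECOND)
);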
I would suggest not keeping such checks in MySQL, because they are not a reliable way of catching mistakes: the user might well click submit or refresh the page after 10 seconds anyway. Instead, put checks in the front-end code to disable clicking the submit button twice, or redirect the user to a page where no variables are passed.
But if that isn't what you would like to do, then this might be your query:
SELECT *
FROM `projects`
WHERE `user` = 'johnsmith'
AND `projectname` = 'test'
AND `active`='y'
AND TIMESTAMPDIFF(SECOND, projectcreatedon, now()) <= 10;
You're trying to fix the problem in the wrong way. Why not eliminate the problem at the source? Make it impossible for the user to create these two projects successively.
If your app makes it possible for a user to submit a form multiple times via refresh, consider using a redirect after the GET/POST variables have been processed.
Furthermore, use simple client-side tricks to disable the submit button after it has been clicked once. You can accomplish this with a very small amount of jQuery.